Developer Resources
Jailbreak Cookbook
An encyclopedia of jailbreaking techniques to make AI models safer.
Tags: AI, LLM, Prompt Engineering, Jailbreak
System Design 101
Explains complex systems using visuals and simple terms, and helps you prepare for system design interviews.
Tags: System Design, Interview Prep
Prompt Engineering Interactive Tutorial
Learn how to effectively prompt LLMs with this interactive, step-by-step guide by Anthropic, the creators of Claude.
Tags: AI, LLM, Prompt Engineering
What can agents actually do?
Great article by Will Larson, CTO at Imprint (formerly at Carta, Stripe, Uber), on what AI agents can and cannot do.
Tags: AI, LLM, Agentic AI