πŸ˜ƒ Basics
🧠 Advanced
πŸ”“ Prompt Hacking

Jailbreaking

Reading Time: 1 minute
Last updated on November 12, 2024

Valeriia Kuka

Jailbreaking is the act of getting a GenAI model to perform or produce unintended outputs through specific prompts. We also wrote a whole article about what Jailbreaking is and how it is different from Prompt Injection: Prompt Injection VS Jailbreaking: What is the difference?

Valeriia Kuka

Valeriia Kuka, Head of Content at Learn Prompting, is passionate about making AI and ML accessible. Valeriia previously grew a 60K+ follower AI-focused social media account, earning reposts from Stanford NLP, Amazon Research, Hugging Face, and AI researchers. She has also worked with AI/ML newsletters and global communities with 100K+ members and authored clear and concise explainers and historical articles.