Free Live Workshop: Vibe Coding with Google AI Studio — April 1

Register now →

Jailbreaking

Last updated on November 12, 2024

Jailbreaking is the act of getting a GenAI model to perform or produce unintended outputs through specific prompts. We also wrote a whole article about what Jailbreaking is and how it is different from Prompt Injection: Prompt Injection VS Jailbreaking: What is the difference?


© 2026 Learn Prompting. All rights reserved.