all AI news
Prompt injection and jailbreaking are not the same thing
Simon Willison's Weblog simonwillison.net
I keep seeing people use the term "prompt injection" when they're actually talking about "jailbreaking".
This mistake is so common now that I'm not sure it's possible to correct course: language meaning (especially for recently coined terms) comes from how that language is used. I'm going to try anyway, because I think the distinction really matters.
Definitions
Prompt injection is a class of attacks against applications built on top of Large Language Models (LLMs) that work by concatenating untrusted user …
ai course generativeai jailbreak jailbreaking language llms meaning people prompt prompt injection promptinjection security terms