all AI news
Google NotebookLM Data Exfiltration
Simon Willison's Weblog simonwillison.net
Google NotebookLM Data Exfiltration
NotebookLM is a Google Labs product that lets you store information as sources (mainly text files in PDF) and then ask questions against those sources - effectively an interface for building your own custom RAG (Retrieval Augmented Generation) chatbots.
Unsurprisingly for anything that allows LLMs to interact with untrusted documents, it's susceptible to prompt injection.
Johann Rehberger found some classic prompt injection exfiltration attacks: you can create source documents with instructions that cause the chatbot to …
ai building chatbots data documents files generativeai google google notebooklm information labs llms notebooklm pdf product promptinjection questions rag retrieval retrieval augmented generation security store text