Diving Deeper into AI Package Hallucinations
Simon Willison's Weblog simonwillison.net
Bar Lanyado noticed that LLMs frequently hallucinate the names of non-existent packages in their answers to coding questions, which can be exploited as a supply chain attack: an attacker registers the hallucinated name, and users who trust the LLM's suggestion install the malicious package.
He gathered 2,500 questions across Python, Node.js, Go, .NET and Ruby and ran them through a number of different LLMs, noting any hallucinated packages and whether those hallucinations were repeated.
One repeat example was "pip install huggingface-cli" (the correct package is …
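The defensive idea here can be sketched in a few lines: check an LLM-suggested package name against a set of known-good packages before installing it. The `KNOWN_PACKAGES` set below is a hypothetical stand-in for a real registry lookup (for example, querying the PyPI JSON API); this is a minimal illustration, not the method from the post.

```python
# Minimal sketch: flag LLM-suggested package names that are not in a
# known-good registry before running `pip install`.
# KNOWN_PACKAGES is a hypothetical stand-in for a real registry query.
KNOWN_PACKAGES = {"requests", "numpy", "huggingface-hub"}

def flag_hallucinated(suggested):
    """Return the suggested names not found in the known-package set."""
    return [name for name in suggested if name.lower() not in KNOWN_PACKAGES]

suspect = flag_hallucinated(["requests", "huggingface-cli"])
print(suspect)  # -> ['huggingface-cli'], a candidate hallucination to verify
```

In practice a lockfile plus a private package index with an allowlist gives the same protection without per-install manual checks.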