April 1, 2024, 10:51 p.m.

Simon Willison's Weblog (simonwillison.net)

Diving Deeper into AI Package Hallucinations


Bar Lanyado noticed that when answering coding questions, LLMs frequently hallucinate the names of packages that don't exist, which can be exploited as a supply chain attack.


He gathered 2,500 questions across Python, Node.js, Go, .NET and Ruby and ran them through a number of different LLMs, noting any hallucinated packages and whether any of those hallucinations were repeated.
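To make that methodology concrete for the Python case, here is a minimal sketch of the kind of existence check it implies, using PyPI's public JSON API. The function name and the example package list are illustrative assumptions, not code from Lanyado's actual tooling:

import urllib.error
import urllib.request

def exists_on_pypi(package: str) -> bool:
    """Return True if `package` is registered on PyPI."""
    url = f"https://pypi.org/pypi/{package}/json"
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # not on PyPI: a candidate hallucination
        raise

# Hypothetical list of package names extracted from LLM answers.
for name in ["requests", "some-hallucinated-package", "numpy"]:
    print(name, "exists" if exists_on_pypi(name) else "NOT REGISTERED")

Note that a name returning 404 today can be registered by anyone tomorrow, which is exactly the attack surface described above.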


One repeat example was "pip install huggingface-cli" (the correct package is …
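An existence check only catches a hallucination while the name is still unclaimed; the point of the attack is that someone can register it afterwards. One hedged mitigation heuristic, my illustration rather than anything from the original research, is to distrust packages whose first upload to PyPI is very recent (the 90-day threshold is an arbitrary assumption):

import json
import urllib.request
from datetime import datetime, timedelta, timezone

def first_upload_time(package: str):
    """Earliest file upload time across all releases, or None if no files."""
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    times = [
        # Trailing "Z" is swapped for "+00:00" so fromisoformat()
        # also works on Python versions older than 3.11.
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in data["releases"].values()
        for f in files
    ]
    return min(times) if times else None

def looks_suspicious(package: str, max_age_days: int = 90) -> bool:
    """Flag packages first published within the last `max_age_days`."""
    created = first_upload_time(package)
    if created is None:
        return True  # registered but with no uploaded files: a red flag
    return datetime.now(timezone.utc) - created < timedelta(days=max_age_days)

This is only a heuristic: it would also flag legitimate new packages, and a patient attacker can simply let a squatted name age.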

