May 22, 2023, 2:31 p.m. | /u/ofirpress

r/MachineLearning (www.reddit.com)

We just put up a paper where we found a wide array of questions that lead GPT-4 & ChatGPT to hallucinate so badly that, in a separate chat session, they can point out that what they previously said was incorrect.

We call these snowballed hallucinations.

[Turn sound ON to watch our demo](https://reddit.com/link/13oskli/video/fqm405jz7e1b1/player)

The paper is here: [https://ofir.io/snowballed_hallucination.pdf](https://ofir.io/snowballed_hallucination.pdf)

There's a summary on Twitter here: [https://twitter.com/OfirPress/status/1660646315049533446](https://twitter.com/OfirPress/status/1660646315049533446)

I'll be here to answer your questions :)
