The Curse of Recursion: Training on Generated Data Makes Models Forget
April 16, 2024, 4:44 a.m. | Ilia Shumailov, Zakhar Shumaylov, Yiren Zhao, Yarin Gal, Nicolas Papernot, Ross Anderson
cs.LG updates on arXiv.org
Abstract: Stable Diffusion revolutionised image creation from descriptive text. GPT-2, GPT-3(.5) and GPT-4 demonstrated astonishing performance across a variety of language tasks. ChatGPT introduced such language models to the general public. It is now clear that large language models (LLMs) are here to stay, and will bring about drastic change in the whole ecosystem of online text and images. In this paper we consider what the future might hold. What will happen to GPT-{n} once LLMs …
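The abstract is truncated before the paper's central claim, but the title names it: repeatedly training on model-generated data degrades what the model can represent. Below is a minimal sketch of that recursive feedback loop, using a toy Gaussian estimator in place of an LLM; the sample size, generation count, and variable names are illustrative assumptions, not the paper's experimental setup.

```python
# Toy illustration of the recursion the title describes (a sketch, not the
# paper's setup): each generation fits a Gaussian to the previous
# generation's samples by maximum likelihood, then produces the training
# data for the next generation. Finite-sample error compounds across
# generations, the fitted scale drifts toward zero, and the tails of the
# original distribution are progressively forgotten.
import numpy as np

rng = np.random.default_rng(0)

n = 100                          # samples per generation (illustrative size)
data = rng.normal(0.0, 1.0, n)   # generation 0 trains on "real" data

for gen in range(2001):
    mu_hat = data.mean()         # "train": ML estimate of the Gaussian
    sigma_hat = data.std()
    if gen % 200 == 0:
        print(f"gen {gen:4d}: mu={mu_hat:+.3f} sigma={sigma_hat:.4f}")
    # The next generation sees only data sampled from the fitted model.
    data = rng.normal(mu_hat, sigma_hat, n)
```

Any single run is noisy, but the log of the fitted scale performs a random walk with negative drift, so over many generations sigma collapses toward zero while the mean wanders. The LLM analogue is the disappearance of low-probability "tail" content from the online text that future models train on.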