April 3, 2024, 4:41 a.m. | Matthias Gerstgrasser, Rylan Schaeffer, Apratim Dey, Rafael Rafailov, Henry Sleight, John Hughes, Tomasz Korbak, Rajashree Agrawal, Dhruv Pai, Andrey

cs.LG updates on arXiv.org

arXiv:2404.01413v1 Announce Type: new
Abstract: The proliferation of generative models, combined with pretraining on web-scale data, raises a timely question: what happens when these models are trained on their own generated outputs? Recent investigations into model-data feedback loops discovered that such loops can lead to model collapse, a phenomenon where performance progressively degrades with each model-fitting iteration until the latest model becomes useless. However, several recent papers studying model collapse assumed that new data replace old data over time rather …
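As an illustrative aside (not part of the abstract), the model-data feedback loop it describes can be sketched with a toy experiment: repeatedly fit a simple model to data, sample synthetic data from the fit, and either replace the previous training set or accumulate into it. The sketch below uses a 1-D Gaussian as the "model"; all names, sample sizes, and iteration counts are assumptions chosen for illustration, not the paper's actual experimental setup.

```python
# Toy sketch of a model-data feedback loop (illustrative only, not the paper's experiments):
# iteratively fit a 1-D Gaussian to its own samples, comparing "replace" vs. "accumulate".
import numpy as np

rng = np.random.default_rng(0)

def feedback_loop(strategy: str, n_iters: int = 50, n_samples: int = 20):
    """Repeatedly fit (mu, sigma) and sample new synthetic data.

    strategy: 'replace'    -> each generation trains only on the latest synthetic data
              'accumulate' -> each generation trains on all data seen so far
    Returns the fitted sigma after each iteration.
    """
    pool = rng.normal(loc=0.0, scale=1.0, size=n_samples)  # initial "real" data
    sigmas = []
    for _ in range(n_iters):
        mu, sigma = pool.mean(), pool.std()
        sigmas.append(sigma)
        synthetic = rng.normal(loc=mu, scale=sigma, size=n_samples)
        if strategy == "replace":
            pool = synthetic                              # old data discarded
        else:
            pool = np.concatenate([pool, synthetic])      # old data kept
    return sigmas

print("replace:   ", [round(s, 3) for s in feedback_loop("replace")][-3:])
print("accumulate:", [round(s, 3) for s in feedback_loop("accumulate")][-3:])
```

In this toy setting, the fitted standard deviation tends to shrink across iterations under the "replace" strategy (a rough analogue of the degradation the abstract calls model collapse), while under "accumulate" it stays close to the original value.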
