Feb. 14, 2024, 8:22 p.m. | /u/FastestGPU

r/MachineLearning | www.reddit.com

**Paper**: [https://arxiv.org/abs/2402.07712](https://arxiv.org/abs/2402.07712)

**Abstract**:

>In the era of large language models like ChatGPT, the phenomenon of "model collapse" refers to the situation whereby a model, trained recursively over time on data generated by previous generations of itself, degrades in performance until it eventually becomes completely useless, i.e., the model collapses. In this work, we study this phenomenon in the simplified setting of kernel regression and obtain results that show a clear crossover between where the model can cope …
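For readers who want to see the phenomenon the abstract describes, here is a minimal sketch of recursive training in a kernel regression setting. This is not the paper's exact setup: each generation of a kernel ridge regressor is fit on labels produced by the previous generation plus fresh noise, and test error against the true function typically grows. The target function, hyperparameters, and noise level are illustrative assumptions.

```python
# Minimal model-collapse sketch (illustrative, not the paper's setup):
# retrain kernel ridge regression on its own noisy predictions each
# generation and watch test error against the true function drift up.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

def true_f(x):
    # Assumed ground-truth function for this toy example.
    return np.sin(3 * x).ravel()

n_train, noise_std, n_generations = 200, 0.1, 6
X_train = rng.uniform(-1.0, 1.0, size=(n_train, 1))
X_test = np.linspace(-1.0, 1.0, 500).reshape(-1, 1)

# Generation 0 trains on real (noisy) labels; every later generation
# trains on the previous model's predictions plus fresh label noise.
y = true_f(X_train) + noise_std * rng.normal(size=n_train)
for gen in range(n_generations):
    model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=10.0)
    model.fit(X_train, y)
    test_mse = np.mean((model.predict(X_test) - true_f(X_test)) ** 2)
    print(f"generation {gen}: test MSE = {test_mse:.4f}")
    # Synthesize the next generation's training labels from this model.
    y = model.predict(X_train) + noise_std * rng.normal(size=n_train)
```

Running this typically shows the test MSE rising over generations, the qualitative signature of collapse that the paper analyzes.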
