Self-Taught Optimizer (STOP): Recursively Self-Improving Code Generation
March 4, 2024, 5:43 a.m. | Eric Zelikman, Eliana Lorch, Lester Mackey, Adam Tauman Kalai
cs.LG updates on arXiv.org arxiv.org
Abstract: Several recent advances in AI systems (e.g., Tree-of-Thoughts and Program-Aided Language Models) solve problems by providing a "scaffolding" program that structures multiple calls to language models to generate better outputs. A scaffolding program is written in a programming language such as Python. In this work, we use a language-model-infused scaffolding program to improve itself. We start with a seed "improver" that improves an input program according to a given utility function by querying a language …
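The seed improver described above can be sketched as a short scaffold: a function that asks a language model for candidate rewrites of a program, scores each with the given utility function, and keeps the best. This is a minimal illustration, not the paper's implementation; in particular, `propose_candidates` is a hypothetical stand-in for the language-model query so the control flow is runnable.

```python
# Minimal sketch of a STOP-style seed "improver" scaffold.
# `propose_candidates` is a stand-in (an assumption) for the language-model
# calls the paper makes; here it returns trivial variations so the code runs.

def propose_candidates(program: str, n: int = 3) -> list[str]:
    """Stand-in for LM calls that would return candidate rewrites."""
    return [program + f"  # candidate {i}" for i in range(n)]

def improve(program: str, utility, n_candidates: int = 3) -> str:
    """Seed improver: propose rewrites of `program`, score each with the
    given utility function, and return the best (the original program is
    kept if no candidate strictly improves on it)."""
    best, best_score = program, utility(program)
    for candidate in propose_candidates(program, n_candidates):
        score = utility(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best
```

STOP's recursive step is then to apply `improve` to its own source code, with a utility that measures how well an improver improves downstream programs.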