Jan. 2, 2024, 11:09 a.m. | /u/APaperADay

r/MachineLearning (www.reddit.com)

**Paper**: [https://arxiv.org/abs/2312.07413](https://arxiv.org/abs/2312.07413)

**Blog post**: [https://epochai.org/blog/ai-capabilities-can-be-significantly-improved-without-expensive-retraining](https://epochai.org/blog/ai-capabilities-can-be-significantly-improved-without-expensive-retraining)

**Abstract**:

>State-of-the-art AI systems can be significantly improved without expensive retraining via "post-training enhancements": techniques applied after initial training, such as fine-tuning the system to use a web browser. We review recent post-training enhancements, categorizing them into five types: tool-use, prompting methods, scaffolding, solution selection, and data generation. Different enhancements improve performance on different tasks, making it hard to compare their significance. So we translate improvements from different enhancements into a common currency, the compute-equivalent gain: how …
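
The compute-equivalent gain can be illustrated with a toy calculation: assume benchmark score follows some scaling curve in training compute, then ask by what factor the baseline compute would have to be scaled for the baseline model to match the enhanced score. The minimal sketch below assumes a made-up power-law scaling curve and illustrative numbers; it is not the paper's fitted model, just a way to see what the "common currency" means.

```python
# Toy illustration of a compute-equivalent gain (CEG).
# Assumption: benchmark score follows a power law in training compute,
# score(C) = a * C**b. The constants a, b and all numbers are made up.

def benchmark_score(compute: float, a: float = 0.01, b: float = 0.05) -> float:
    """Hypothetical scaling law: score rises as a power law in training compute."""
    return a * compute ** b

def compute_equivalent_gain(baseline_compute: float, enhanced_score: float,
                            a: float = 0.01, b: float = 0.05) -> float:
    """Invert the assumed scaling law to find the compute multiplier whose
    score matches the enhancement, i.e. CEG = C_equivalent / C_baseline."""
    # Solve a * C**b = enhanced_score for C, then divide by the baseline compute.
    equivalent_compute = (enhanced_score / a) ** (1.0 / b)
    return equivalent_compute / baseline_compute

if __name__ == "__main__":
    c0 = 1e24                     # baseline training compute in FLOP (illustrative)
    s0 = benchmark_score(c0)      # baseline benchmark score under the toy law
    s1 = s0 * 1.10                # suppose the enhancement lifts the score by 10%
    ceg = compute_equivalent_gain(c0, s1)
    print(f"baseline score {s0:.3f}, enhanced score {s1:.3f}, CEG ~ {ceg:.1f}x")
```

Under this toy power law the 10% score lift corresponds to roughly a 6.7x compute multiplier; with a different assumed scaling curve the number would differ, which is exactly why the paper frames gains in this compute-equivalent currency rather than raw benchmark deltas.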

