May 6, 2024, 5:43 a.m. | /u/sunchipsster

r/MachineLearning · www.reddit.com

Hello folks! See post here: [https://twitter.com/abacaj/status/1785147493728039111](https://twitter.com/abacaj/status/1785147493728039111)

I didn't understand what he meant by "with zero-training (actually just a simple 2 line config) you can get 32k context out of llama-3 models"

Does anyone know what this **dynamic scaling trick** is? Much appreciated! :)
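
For context, my best guess is that this refers to the dynamic NTK-aware RoPE scaling that HuggingFace transformers exposes through the `rope_scaling` entry in the model config. A minimal sketch, assuming that's the trick being referenced (the `{"type": "dynamic", "factor": ...}` dict follows the transformers `LlamaConfig` API; llama-3's pretrained window is 8k tokens, so a factor of 4 would target roughly 32k):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

# Dynamic NTK-aware RoPE scaling: the rotary-embedding base is rescaled
# on the fly once the input grows past the pretrained context window
# (8k for llama-3), so no fine-tuning is required.
# factor=4.0 targets roughly 8k * 4 = 32k tokens (my assumption here).
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    rope_scaling={"type": "dynamic", "factor": 4.0},
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```

If that's right, the "simple 2 line config" would just be adding `"rope_scaling": {"type": "dynamic", "factor": 4.0}` to the model's config.json, but I'd appreciate confirmation.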
