Dec. 5, 2022, 4:01 p.m. | Together

Blog Content - TOGETHER www.together.xyz

We introduce a compression technique that enables training models over networks with 20× less bandwidth, with minimal impact on training time. Another step toward democratizing the training and tuning of foundation models!
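The post itself does not spell out the compression scheme, but a common way to cut gradient-communication bandwidth by a fixed factor is top-k sparsification with error feedback: each worker sends only the largest-magnitude 1/20 of its gradient entries and folds the dropped remainder back into the next step so the error does not accumulate. The sketch below is a generic illustration of that idea, not Together's actual method; all names (`ErrorFeedbackCompressor`, `topk_compress`) are hypothetical.

```python
import numpy as np

def topk_compress(grad, ratio=0.05):
    """Keep only the largest-magnitude `ratio` fraction of entries.

    Returns the indices and values actually transmitted, plus the
    original shape needed to reconstruct the tensor. ratio=0.05
    corresponds to sending 20x fewer values.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def topk_decompress(idx, values, shape):
    """Rebuild a dense tensor with zeros in the untransmitted slots."""
    out = np.zeros(int(np.prod(shape)))
    out[idx] = values
    return out.reshape(shape)

class ErrorFeedbackCompressor:
    """Top-k compression with error feedback (a hypothetical sketch).

    The residual (entries dropped this round) is added back to the
    next gradient, so the compression error stays bounded instead of
    accumulating over training steps.
    """
    def __init__(self, ratio=0.05):
        self.ratio = ratio
        self.residual = None

    def step(self, grad):
        if self.residual is None:
            self.residual = np.zeros_like(grad)
        corrected = grad + self.residual          # re-inject past error
        idx, vals, shape = topk_compress(corrected, self.ratio)
        sent = topk_decompress(idx, vals, shape)  # what peers receive
        self.residual = corrected - sent          # carry dropped mass
        return sent
```

With `ratio=0.05`, each step transmits 5% of the entries (plus their indices), which is where a roughly 20× bandwidth reduction would come from under this scheme.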

Tags: communication compression, decentralized training, foundation models, NeurIPS 2022, research
