Oct. 3, 2023, 7:03 p.m. | /u/Singularian2501

r/MachineLearning

Blog: [https://www.deepmind.com/blog/scaling-up-learning-across-many-different-robot-types](https://www.deepmind.com/blog/scaling-up-learning-across-many-different-robot-types)

Project page: [https://robotics-transformer-x.github.io/](https://robotics-transformer-x.github.io/), where the datasets and code are also available (a loading sketch follows below the paper link).

Paper: [https://robotics-transformer-x.github.io/paper.pdf](https://robotics-transformer-x.github.io/paper.pdf)
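The project page distributes the Open X-Embodiment datasets in RLDS format, loadable through TensorFlow Datasets. Below is a minimal sketch of pulling a few episodes from one of the constituent datasets; the `gs://gresearch/robotics` bucket path, dataset name, and version string follow the project's example notebooks but should be treated as assumptions and checked against the project page.

```python
# Minimal sketch: load a few RLDS episodes from one Open X-Embodiment
# dataset. Bucket path, dataset name ("bridge"), and version ("0.1.0")
# are assumptions taken from the project's example notebooks.
import tensorflow_datasets as tfds

builder = tfds.builder_from_directory(
    builder_dir="gs://gresearch/robotics/bridge/0.1.0/"
)
ds = builder.as_dataset(split="train[:10]")

for episode in ds:
    # Each RLDS episode holds a nested "steps" dataset of
    # observation/action pairs for one robot trajectory.
    for step in episode["steps"]:
        print(step["observation"].keys())
        break
    break
```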

Abstract:

>Large, high-capacity models trained on diverse datasets have shown remarkable successes on efficiently tackling downstream applications. In domains from NLP to Computer Vision, this has led to a consolidation of pretrained models, with general pretrained backbones serving as a starting point for many applications. Can such a consolidation happen in robotics? Conventionally, robotic learning methods train a separate model for every application, every robot, and …

