Where Should I Spend My FLOPS? Efficiency Evaluations of Visual Pre-training Methods. (arXiv:2209.15589v3 [cs.CV] UPDATED)
Oct. 7, 2022, 1:16 a.m. | Skanda Koppula, Yazhe Li, Evan Shelhamer, Andrew Jaegle, Nikhil Parthasarathy, Relja Arandjelovic, João Carreira, Olivier Hénaff
cs.CV updates on arXiv.org
Self-supervised methods have achieved remarkable success in transfer
learning, often matching or exceeding the accuracy of supervised
pre-training. Most prior work has done so by increasing pre-training
computation, whether through complex data augmentation, multiple views, or
lengthy training schedules. In this work, we investigate a related but
orthogonal question: given a fixed FLOP budget, what are the best datasets,
models, and (self-)supervised training methods for obtaining high accuracy on
representative visual tasks? Given the availability of large datasets, this …
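The fixed-budget framing above can be made concrete with a back-of-the-envelope accounting: under a fixed FLOP budget, methods that spend more compute per example (e.g. processing multiple augmented views) can see proportionally fewer examples. The sketch below is illustrative only, assuming the common approximation that a backward pass costs roughly 2x a forward pass; the method names and per-example costs are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: how many training examples fit in a fixed FLOP budget.
# All method names and FLOP counts below are illustrative assumptions.

def examples_within_budget(budget_flops, flops_per_forward, backward_multiplier=2.0):
    """Return how many training examples a FLOP budget affords.

    One training step is approximated as one forward pass plus a backward
    pass costing `backward_multiplier` forwards (commonly ~2x).
    """
    flops_per_example = flops_per_forward * (1.0 + backward_multiplier)
    return int(budget_flops // flops_per_example)

# Illustrative per-example forward costs (FLOPs) for two hypothetical setups.
methods = {
    "supervised_single_view": 4.1e9,     # one forward per example
    "contrastive_two_views": 2 * 4.1e9,  # two augmented views per example
}

budget = 1e18  # a fixed pre-training budget (illustrative)
for name, cost in methods.items():
    n = examples_within_budget(budget, cost)
    print(f"{name}: {n:,} examples within budget")
```

Under this accounting, the two-view method sees half as many unique examples for the same budget, which is the kind of trade-off a FLOP-matched comparison has to weigh against any per-example accuracy gains.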