Revisiting Weakly Supervised Pre-Training of Visual Perception Models. (arXiv:2201.08371v1 [cs.CV])
Jan. 21, 2022, 2:10 a.m. | Mannat Singh, Laura Gustafson, Aaron Adcock, Vinicius de Freitas Reis, Bugra Gedik, Raj Prateek Kosaraju, Dhruv Mahajan, Ross Girshick, Piotr Dollár
cs.CV updates on arXiv.org
Model pre-training is a cornerstone of modern visual recognition systems.
Although fully supervised pre-training on datasets like ImageNet is still the
de facto standard, recent studies suggest that large-scale weakly supervised
pre-training can outperform fully supervised approaches. This paper revisits
weakly supervised pre-training of models using hashtag supervision with modern
versions of residual networks and the largest-ever dataset of images and
corresponding hashtags. We study the performance of the resulting models in
various transfer-learning settings including zero-shot transfer. We also
compare our …
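Hashtag supervision of this kind is commonly framed as classification over a hashtag vocabulary, with each image's loss computed against a target distribution spread over its attached hashtags. The sketch below illustrates one such per-image softmax cross-entropy formulation; the function name, uniform target weighting, and plain-list inputs are illustrative assumptions, not the paper's exact loss.

```python
import math

def hashtag_softmax_loss(logits, hashtag_indices):
    """Cross-entropy between a softmax over the hashtag vocabulary and a
    uniform target distribution over the hashtags attached to one image.

    logits: raw model scores, one per hashtag in the vocabulary
    hashtag_indices: indices of the hashtags attached to this image
    (Illustrative formulation only; the paper's exact loss may differ.)
    """
    # Numerically stable log-softmax over the full hashtag vocabulary
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    log_probs = [x - log_z for x in logits]
    # Spread the target mass uniformly over this image's hashtags
    w = 1.0 / len(hashtag_indices)
    return -sum(w * log_probs[i] for i in hashtag_indices)

# Example: a 4-hashtag vocabulary, image tagged with hashtags 0 and 2
loss = hashtag_softmax_loss([2.0, 0.0, 1.0, 0.0], [0, 2])
```

With uninformative (all-equal) logits this loss reduces to log of the vocabulary size, which is a useful sanity check when wiring up such a pipeline.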