Bridging Diversity and Uncertainty in Active Learning with Self-Supervised Pre-Training
March 7, 2024, 5:41 a.m. | Paul Doucet, Benjamin Estermann, Till Aczel, Roger Wattenhofer
cs.LG updates on arXiv.org arxiv.org
Abstract: This study addresses the integration of diversity-based and uncertainty-based sampling strategies in active learning, particularly within the context of self-supervised pre-trained models. We introduce a straightforward heuristic called TCM that mitigates the cold start problem while maintaining strong performance across various data levels. By initially applying TypiClust for diversity sampling and subsequently transitioning to uncertainty sampling with Margin, our approach effectively combines the strengths of both strategies. Our experiments demonstrate that TCM consistently outperforms existing …
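The two-stage idea described in the abstract, TypiClust-style diversity sampling for the cold-start rounds, then Margin-based uncertainty sampling once a model exists, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names (`tcm_select`, `typicality`) are hypothetical, KMeans stands in for TypiClust's clustering, and the switch happens after the first round for simplicity.

```python
import numpy as np
from sklearn.cluster import KMeans

def typicality(X, k=5):
    # Typicality of each point: inverse of the mean distance to its
    # k nearest neighbours (denser regions -> more typical).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    knn = np.sort(d, axis=1)[:, 1:k + 1]  # skip self-distance at column 0
    return 1.0 / (knn.mean(axis=1) + 1e-8)

def select_diverse(X, budget):
    # TypiClust-style cold start: cluster the (self-supervised) embeddings
    # into `budget` clusters and pick the most typical point of each.
    km = KMeans(n_clusters=budget, n_init=10, random_state=0).fit(X)
    t = typicality(X)
    picks = []
    for c in range(budget):
        idx = np.where(km.labels_ == c)[0]
        picks.append(idx[np.argmax(t[idx])])
    return np.array(picks)

def select_margin(probs, budget, labeled):
    # Margin uncertainty: smallest gap between the top-2 class
    # probabilities; already-labeled points are never re-selected.
    sorted_p = np.sort(probs, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]
    margin[list(labeled)] = np.inf
    return np.argsort(margin)[:budget]

def tcm_select(X, probs, labeled, budget, round_idx):
    # Round 0 has no trained model, so query diversity; afterwards,
    # switch to uncertainty (the "T-then-M" transition).
    if round_idx == 0:
        return select_diverse(X, budget)
    return select_margin(probs, budget, labeled)
```

Usage: in each active-learning round, call `tcm_select` with the current model's predicted probabilities and the set of labeled indices, label the returned points, and retrain. The paper's actual heuristic (including when to transition) differs in detail; this only shows the shape of the combination.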