March 27, 2024, 4:47 a.m. | Julio Silva-Rodríguez, Sina Hajimiri, Ismail Ben Ayed, Jose Dolz

cs.CV updates on arXiv.org

arXiv:2312.12730v2 Announce Type: replace
Abstract: Efficient transfer learning (ETL) is receiving increasing attention as a way to adapt large pre-trained vision-language models to downstream tasks with only a few labeled samples. While significant progress has been made, we reveal that state-of-the-art ETL approaches exhibit strong performance only in narrowly defined experimental setups, and only after careful hyperparameter tuning on a large corpus of labeled samples. In particular, we make two interesting and surprising empirical observations. First, to outperform a simple Linear Probing baseline, …
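For context on the Linear Probing baseline the abstract compares against: below is a minimal sketch of such a probe, a single linear classifier trained on frozen image embeddings. It assumes CLIP-like 512-dimensional features and uses synthetic class clusters as a stand-in for real encoder outputs; the shot count, feature dimension, and regularization value are illustrative, not the paper's actual setup.

# Minimal sketch of a Linear Probing baseline: logistic regression
# on frozen image embeddings. Features here are synthetic stand-ins
# for the outputs of a frozen CLIP-like encoder.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
num_classes, shots, feat_dim = 10, 16, 512  # 16-shot setup, 512-d features

# One Gaussian cluster per class, mimicking frozen encoder features.
centers = rng.normal(size=(num_classes, feat_dim))
X_train = np.concatenate(
    [c + 0.1 * rng.normal(size=(shots, feat_dim)) for c in centers]
)
y_train = np.repeat(np.arange(num_classes), shots)

# The probe itself: one linear layer with L2 regularization. The
# strength C is the kind of hyperparameter the abstract's observation
# concerns; here it is fixed rather than tuned on extra labeled data.
probe = LogisticRegression(C=1.0, max_iter=1000)
probe.fit(X_train, y_train)

X_test = centers + 0.1 * rng.normal(size=(num_classes, feat_dim))
print("probe accuracy:", probe.score(X_test, np.arange(num_classes)))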
