March 14, 2024, 4:46 a.m. | Yuxin Tian, Mouxing Yang, Yunfan Li, Dayiheng Liu, Xingzhang Ren, Xi Peng, Jiancheng Lv

cs.CV updates on arXiv.org

arXiv:2403.08433v1 Announce Type: new
Abstract: Recent studies have applied Parameter-Efficient Fine-Tuning techniques (PEFTs) to efficiently narrow the performance gap between pre-training and downstream tasks. Two factors are important for various PEFTs: the accessible data size and the fine-tunable parameter size. A natural expectation is that the performance of a PEFT is positively related to both its data size and its fine-tunable parameter size. However, according to an evaluation of five PEFTs on two downstream vision-language (VL) tasks, we find that …
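The abstract's "fine-tunable parameter size" is the number of weights a PEFT actually updates while the pre-trained backbone stays frozen. The excerpt does not name the five PEFTs evaluated, so as an illustrative assumption the sketch below uses LoRA, one widely used PEFT, to show how that parameter budget is controlled by a single rank hyperparameter `r`; it is a minimal sketch, not the authors' setup.

```python
# Minimal LoRA-style sketch (illustrative; the paper's five PEFTs are not
# listed in this excerpt). Only the low-rank factors are trainable, so the
# fine-tunable parameter size scales with the rank r.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pre-trained linear layer plus a trainable low-rank update."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze pre-trained weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        # Trainable low-rank factors: r * (in_features + out_features) params.
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen pre-trained path + trainable low-rank adaptation path.
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling

layer = LoRALinear(nn.Linear(768, 768), r=4)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")  # 4 * (768 + 768) = 6144
```

Varying `r` (and, analogously, the amount of downstream data) is exactly the kind of axis along which the paper studies whether PEFT performance scales as expected.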

