March 26, 2024, 4:49 a.m. | Qizhe Zhang, Bocheng Zou, Ruichuan An, Jiaming Liu, Shanghang Zhang

cs.CV updates on arXiv.org

arXiv:2312.02923v2 Announce Type: replace
Abstract: With the rapid growth in the scale of pre-trained foundation models, parameter-efficient fine-tuning techniques have gained significant attention, among which Adapter Tuning is the most widely used. Despite its efficiency, it still underperforms full fine-tuning, and its performance improves only at the cost of additional parameters. Recent efforts have either trained multiple adapter experts to increase model capacity or pruned adapters to achieve parameter efficiency. However, both approaches introduce more parameters …
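For context, the sketch below shows a standard bottleneck adapter of the kind Adapter Tuning inserts into a frozen backbone. It is a minimal illustration of the general technique (a Houlsby-style adapter), not the method proposed in this paper; the `bottleneck_dim` default and the `freeze_backbone_train_adapters` helper are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project,
    with a residual connection. A sketch, not this paper's method."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Zero-init the up-projection so the adapter starts as an identity map
        # and fine-tuning begins from the pre-trained model's behavior.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

def freeze_backbone_train_adapters(model: nn.Module) -> None:
    """Parameter-efficient fine-tuning: train only parameters whose
    names mark them as adapter weights; freeze everything else."""
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name
```

Only the adapter's small down/up projections are trained, which is the source of the parameter-vs-accuracy trade-off the abstract describes: enlarging or multiplying these modules (e.g., adapter experts) raises capacity but also raises the parameter count.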
