Asymmetry in Low-Rank Adapters of Foundation Models
Feb. 28, 2024, 5:43 a.m. | Jiacheng Zhu, Kristjan Greenewald, Kimia Nadjahi, Haitz Sáez de Ocáriz Borde, Rickard Brüel Gabrielsson, Leshem Choshen, Marzyeh Ghassemi, Mikha
cs.LG updates on arXiv.org
Abstract: Parameter-efficient fine-tuning optimizes large, pre-trained foundation models by updating a subset of parameters; in this class, Low-Rank Adaptation (LoRA) is particularly effective. Inspired by an effort to investigate the different roles of LoRA matrices during fine-tuning, this paper characterizes and leverages unexpected asymmetry in the importance of low-rank adapter matrices. Specifically, when updating the parameter matrices of a neural network by adding a product $BA$, we observe that the $B$ and $A$ matrices have distinct …
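The asymmetric scheme the abstract alludes to can be sketched in a few lines of numpy: the layer's frozen weight $W$ is augmented with a low-rank product $BA$, and only one factor is trained. The dimensions, learning rate, loss, and the choice to freeze $A$ at a random initialization while updating $B$ are illustrative assumptions for this sketch, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 8, 8, 2                 # illustrative sizes, with rank r << d
W = rng.standard_normal((d_out, d_in))   # frozen pre-trained weight

# LoRA factors: the update is W + B @ A, with B (d_out x r) and A (r x d_in).
A = rng.standard_normal((r, d_in)) / np.sqrt(d_in)  # frozen at random init here
B = np.zeros((d_out, r))                 # trained; zero init keeps W + BA = W at start

def forward(x):
    # adapted layer: original weight plus the low-rank correction
    return (W + B @ A) @ x

x = rng.standard_normal(d_in)
# at initialization the adapter is a no-op, since B = 0
assert np.allclose(forward(x), W @ x)

# one gradient step on B alone, A stays fixed (a toy squared-error objective)
target = rng.standard_normal(d_out)
err = forward(x) - target                # residual before the update
grad_B = np.outer(err, A @ x)            # d/dB of 0.5 * ||(W + B A) x - target||^2
B -= 0.01 * grad_B
```

Freezing $A$ as a random projection and adapting only $B$ is one concrete way to exploit the asymmetry the paper describes; the full abstract (truncated above) develops why the two factors play such distinct roles.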