Low-Rank Adaptation for Multilingual Summarization: An Empirical Study
April 2, 2024, 7:45 p.m. | Chenxi Whitehouse, Fantine Huot, Jasmijn Bastings, Mostafa Dehghani, Chu-Cheng Lin, Mirella Lapata
cs.LG updates on arXiv.org arxiv.org
Abstract: Although the advancements of pre-trained Large Language Models have significantly accelerated recent progress in NLP, their ever-increasing size poses significant challenges for conventional fine-tuning, especially in memory-intensive tasks. We investigate the potential of Parameter-Efficient Fine-Tuning, focusing on Low-Rank Adaptation (LoRA), in the domain of multilingual summarization, a task that is both challenging (due to typically long inputs) and relatively unexplored. We conduct an extensive study across different data availability scenarios, including high- and low-data settings, …
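For readers unfamiliar with LoRA, the core idea is to freeze the pretrained weight matrix and learn only a low-rank update, drastically cutting the number of trainable parameters. The sketch below is a minimal illustration of that mechanism, not code from the paper; the dimensions, rank, and scaling factor are arbitrary placeholders.

```python
import numpy as np

# Hypothetical dimensions for illustration only (not from the paper).
d_out, d_in, r = 512, 512, 8
alpha = 16.0  # LoRA scaling hyperparameter
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # zero-initialized, so the update starts at 0

def lora_forward(x):
    # Effective weight is W + (alpha / r) * B @ A; only A and B are trained.
    return W @ x + (alpha / r) * (B @ (A @ x))

full_params = W.size           # parameters a full fine-tune would update
lora_params = A.size + B.size  # parameters LoRA updates instead
print(f"trainable params: {lora_params} vs full fine-tune: {full_params}")
```

Because B is initialized to zero, the adapted layer initially behaves exactly like the frozen model, and training only has to fit the small A and B factors.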