Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation

March 28, 2024, 4:48 a.m. | Mateusz Klimaszewski, Piotr Andruszkiewicz, Alexandra Birch

cs.CL updates on arXiv.org

arXiv:2403.18804v1 Announce Type: new
Abstract: The rise of Modular Deep Learning showcases its potential in various Natural Language Processing applications. Parameter-efficient fine-tuning (PEFT) modularity has been shown to work for use cases ranging from domain adaptation to multilingual setups. However, all of this work covers the case where the modular components are trained and deployed within a single Pre-trained Language Model (PLM). This model-specific setup is a substantial limitation on the very modularity that modular architectures aim to provide. We …
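As a concrete illustration of the model-specific PEFT setup the abstract describes, here is a minimal sketch using the Hugging Face peft library (not the paper's own code; the model name and hyperparameters are assumptions). The LoRA adapter's low-rank matrices are shaped to one particular PLM's layer names and dimensions, so its weights cannot simply be dropped into a different base model:

# Minimal sketch (assumptions: bert-base-cased as the PLM, LoRA as the
# PEFT method). The adapter matrices match this model's layer names and
# sizes, illustrating the model-specific coupling discussed above.
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained("bert-base-cased")
lora = LoraConfig(r=8, lora_alpha=16, target_modules=["query", "value"])
model = get_peft_model(base, lora)   # adapter is now tied to this PLM
model.print_trainable_parameters()   # only the LoRA weights are trainable

Swapping in a different PLM (say, a RoBERTa checkpoint) would require retraining the adapter from scratch, which is exactly the transferability gap the paper examines.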

