Challenging Common Paradigms in Multi-Task Learning
March 28, 2024, 4:43 a.m. | Cathrin Elich, Lukas Kirchdorfer, Jan M. Köhler, Lukas Schott
cs.LG updates on arXiv.org (arxiv.org)
Abstract: While multi-task learning (MTL) has gained significant attention in recent years, its underlying mechanisms remain poorly understood. Recent methods have not yielded consistent performance improvements over single-task learning (STL) baselines, underscoring the importance of gaining deeper insights into the challenges specific to MTL. In our study, we challenge paradigms in MTL in the context of STL: First, the impact of the choice of optimizer has only been mildly investigated in MTL. We show the …
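To make the MTL-vs-STL comparison concrete, here is a minimal sketch (hypothetical, not taken from the paper) of the most common MTL setup, hard parameter sharing: one set of shared weights feeds a scalar head per task, and a single gradient step is taken on the sum of the per-task losses. All function and variable names are illustrative assumptions.

```python
# Hypothetical sketch of hard-parameter-sharing MTL: shared weights w,
# one scalar head per task, trained on the summed squared loss.

def mtl_step(w, heads, x, targets, lr=0.05):
    """One gradient-descent step on the summed loss over all tasks.

    w       -- shared weights, one per input feature
    heads   -- per-task head weights (one scalar per task)
    x       -- a single input example (list of features)
    targets -- one regression target per task
    """
    shared = sum(wi * xi for wi, xi in zip(w, x))   # shared representation
    preds = [h * shared for h in heads]             # one prediction per task
    errors = [p - t for p, t in zip(preds, targets)]

    # Gradients of the summed loss  L = sum_k (h_k * shared - t_k)^2
    grad_w = [2 * sum(e * h for e, h in zip(errors, heads)) * xi for xi in x]
    grad_h = [2 * e * shared for e in errors]

    w = [wi - lr * g for wi, g in zip(w, grad_w)]
    heads = [h - lr * g for h, g in zip(heads, grad_h)]
    loss = sum(e * e for e in errors)
    return w, heads, loss
```

In an STL baseline, each task would instead train its own copy of `w`; the paper's point is that comparisons between the two regimes are sensitive to choices, such as the optimizer, that are often left uncontrolled.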