April 13, 2022, 1:11 a.m. | Jaime Spencer, Richard Bowden, Simon Hadfield

cs.LG updates on arXiv.org

Recent approaches to multi-task learning (MTL) have focused on modelling
connections between tasks at the decoder level. This leads to a tight coupling
between tasks, which must be retrained whenever a new task is inserted or an
existing one is removed. We argue that MTL is a stepping stone towards
universal feature learning (UFL): the ability to learn generic features that
can be applied to new tasks without retraining.


We propose Medusa to realize this goal, designing task heads with dual
attention mechanisms. …
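The abstract is truncated here, but the core idea is trainable per-task heads attached to shared, frozen features, so new tasks can be added without retraining existing ones. As a rough illustration only, here is a minimal PyTorch sketch of a task head with a dual attention mechanism; the class name, the channel/spatial split, and all layer sizes are assumptions for illustration, not the paper's actual Medusa design.

```python
import torch
import torch.nn as nn

class DualAttentionTaskHead(nn.Module):
    """Hypothetical task head combining channel and spatial attention
    over shared backbone features (illustrative, not the paper's design)."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # Channel attention: squeeze-and-excitation style gating.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_channels, in_channels // 8, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels // 8, in_channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: a single-channel gating map over locations.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(in_channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )
        # Task-specific prediction layer.
        self.predict = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, shared_features: torch.Tensor) -> torch.Tensor:
        x = shared_features * self.channel_gate(shared_features)  # reweight channels
        x = x * self.spatial_gate(x)                              # reweight locations
        return self.predict(x)

# Usage: the shared encoder stays frozen; only per-task heads are trained,
# so adding a new head does not disturb tasks that are already learned.
feats = torch.randn(2, 256, 32, 32)          # e.g. output of a frozen shared encoder
depth_head = DualAttentionTaskHead(256, 1)   # hypothetical per-task head
print(depth_head(feats).shape)               # torch.Size([2, 1, 32, 32])
```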

arxiv learning multitasking
