Attentive Task Interaction Network for Multi-Task Learning. (arXiv:2201.10649v1 [cs.CV])
Jan. 27, 2022, 2:10 a.m. | Dimitrios Sinodinos, Narges Armanfard
cs.LG updates on arXiv.org
Multitask learning (MTL) has recently gained considerable popularity as a
learning paradigm that can improve per-task performance while using fewer
per-task model parameters than single-task learning. One of the biggest
challenges in MTL networks is how to share features across tasks. To address
this challenge, we propose the Attentive Task Interaction Network (ATI-Net).
ATI-Net applies knowledge distillation to the latent features of each task,
then combines the resulting feature maps to provide improved contextualized …
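The interaction step the abstract describes, combining per-task feature maps through attention, can be sketched with a toy NumPy example. This is a rough illustration under assumptions: the function names, tensor shapes, and dot-product attention here are placeholders, not the authors' actual ATI-Net implementation, which learns its weights end to end.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_combine(task_features):
    """Combine per-task feature maps with cross-task attention weights.

    task_features: list of (C, H, W) arrays, one per task.
    Returns one (C, H, W) map per task, each an attention-weighted
    mixture of all tasks' features. (Hypothetical stand-in for the
    cross-task interaction in ATI-Net.)
    """
    stacked = np.stack(task_features)           # (T, C, H, W)
    # Task-to-task similarity from globally average-pooled features.
    pooled = stacked.mean(axis=(2, 3))          # (T, C)
    scores = pooled @ pooled.T                  # (T, T)
    weights = softmax(scores, axis=1)           # attention over source tasks
    # Weighted sum of all tasks' feature maps for each target task.
    combined = np.einsum('ts,schw->tchw', weights, stacked)
    return [combined[t] for t in range(combined.shape[0])]

rng = np.random.default_rng(0)
feats = [rng.random((8, 4, 4)) for _ in range(3)]
out = attentive_combine(feats)
print(len(out), out[0].shape)  # 3 (8, 4, 4)
```

Each output map keeps its task's own features dominant (a task is most similar to itself) while mixing in context from the other tasks, which is the intuition behind attention-based feature sharing in MTL.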