Web: http://arxiv.org/abs/2201.10649

Jan. 27, 2022, 2:10 a.m. | Dimitrios Sinodinos, Narges Armanfard

cs.LG updates on arXiv.org

Multitask learning (MTL) has recently gained considerable popularity as a
learning paradigm that can improve per-task performance while also using
fewer per-task model parameters than single-task learning. One of the
biggest challenges in MTL networks is how to share features across tasks.
To address this challenge, we propose the Attentive Task Interaction
Network (ATI-Net). ATI-Net employs knowledge distillation of the latent
features for each task, then combines the feature maps to provide improved
contextualized …
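The abstract describes combining per-task latent feature maps so each task can draw context from the others. As a rough illustration of that idea, here is a minimal sketch of attention-weighted combination of task feature maps; the function name, scaled dot-product scoring, and shapes are assumptions for illustration, not the actual ATI-Net architecture or its distillation step.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_combine(task_feats, query_task):
    """Combine per-task feature maps into one contextualized map for a task.

    task_feats: dict mapping task name -> (C, H, W) feature map.
    query_task: the task whose features drive the attention scores.

    Hypothetical sketch of cross-task attention: each task's map is scored
    against the query task's map, and the maps are averaged with the
    resulting softmax weights. Not the method proposed in the paper.
    """
    names = list(task_feats)
    q = task_feats[query_task].reshape(-1)                       # flattened query
    keys = np.stack([task_feats[n].reshape(-1) for n in names])  # (T, C*H*W)
    scores = keys @ q / np.sqrt(q.size)                          # scaled dot-product
    weights = softmax(scores)                                    # one weight per task
    stacked = np.stack([task_feats[n] for n in names])           # (T, C, H, W)
    combined = np.tensordot(weights, stacked, axes=1)            # (C, H, W)
    return combined, dict(zip(names, weights))
```

For example, with two hypothetical tasks ("seg" and "depth") producing 4×8×8 feature maps, the combined map keeps the same shape and the per-task weights sum to 1.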

arxiv cv learning multi-task learning network
