Attentive Task Interaction Network for Multi-Task Learning. (arXiv:2201.10649v1 [cs.CV])
Web: http://arxiv.org/abs/2201.10649
Jan. 27, 2022, 2:10 a.m. | Dimitrios Sinodinos, Narges Armanfard
cs.LG updates on arXiv.org
Multitask learning (MTL) has recently gained a lot of popularity as a
learning paradigm that can lead to improved per-task performance while also
using fewer per-task model parameters compared to single-task learning. One of
the biggest challenges in MTL networks is how to share features
across tasks. To address this challenge, we propose the Attentive Task
Interaction Network (ATI-Net). ATI-Net employs knowledge distillation of the
latent features for each task, then combines the feature maps to provide
improved contextualized …
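The core idea the abstract describes — combining per-task feature maps with attention to produce contextualized features — can be sketched roughly as follows. This is a minimal illustration, not the paper's actual architecture: the scalar scoring projection `w`, the softmax-over-tasks weighting, and all function names are assumptions for the sake of the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_combine(task_feats, w):
    """Combine per-task latent features with attention weights.

    task_feats: (T, C) array, one C-dim feature vector per task.
    w: (C,) hypothetical scoring projection (a stand-in for a
       learned attention module in the real network).
    """
    scores = task_feats @ w          # (T,) one relevance score per task
    alpha = softmax(scores)          # attention weights over tasks, sum to 1
    combined = alpha @ task_feats    # (C,) attention-weighted combination
    return combined, alpha

# Toy example: 3 tasks, 8-dimensional latent features.
rng = np.random.default_rng(0)
feats = rng.standard_normal((3, 8))
w = rng.standard_normal(8)
combined, alpha = attentive_combine(feats, w)
```

In the actual ATI-Net, the distilled latent features and the attention mechanism are learned end to end; the sketch above only shows the weighted-combination step in isolation.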