Learning Multi-Tasks with Inconsistent Labels by using Auxiliary Big Task. (arXiv:2201.02305v1 [cs.LG])
Jan. 10, 2022, 2:10 a.m. | Quan Feng, Songcan Chen
cs.LG updates on arXiv.org arxiv.org
Multi-task learning (MTL) aims to improve model performance by transferring and exploiting knowledge that is common among tasks. Existing MTL works mainly focus on the scenario where the label sets of the multiple tasks (MTs) are the same, so that the labels can be used for learning across the tasks. Few works explore the scenario where each task has only a small number of training samples and the label sets are only partially overlapping, or do not overlap at all. Learning such MTs is …
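The hard-parameter-sharing setup the abstract alludes to — tasks with different, partially overlapping label sets that share a common representation — can be sketched as below. This is a generic illustration under assumed names (`W_shared`, `heads`, the class lists), not the method proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two tasks share a feature extractor but have
# partially overlapping label sets, e.g. task A: {cat, dog, bird}
# and task B: {dog, bird, fish}. Only the shared layer transfers
# knowledge; each task keeps its own classification head.
d_in, d_hidden = 8, 4
W_shared = rng.normal(size=(d_in, d_hidden))   # shared representation layer
heads = {
    "task_A": rng.normal(size=(d_hidden, 3)),  # classes: cat, dog, bird
    "task_B": rng.normal(size=(d_hidden, 3)),  # classes: dog, bird, fish
}

def predict(x, task):
    """Map inputs through the shared layer, then the task-specific head."""
    h = np.tanh(x @ W_shared)   # representation common to all tasks
    logits = h @ heads[task]    # task-specific scores over that task's labels
    return logits.argmax(axis=1)

x = rng.normal(size=(5, d_in))
preds_a = predict(x, "task_A")
preds_b = predict(x, "task_B")
```

Each head indexes only its own label set, so the two tasks never need a shared label space — only the shared layer carries the transferred knowledge.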