Mixture of basis for interpretable continual learning with distribution shifts. (arXiv:2201.01853v1 [cs.LG])
Jan. 7, 2022, 2:10 a.m. | Mengda Xu, Sumitra Ganesh, Pranay Pasula
cs.LG updates on arXiv.org arxiv.org
Continual learning in environments with shifting data distributions is a challenging problem with several real-world applications. In this paper we consider settings in which the data distribution (task) shifts abruptly and the timing of these shifts is not known. Furthermore, we consider a semi-supervised, task-agnostic setting in which the learning algorithm has access to both task-segmented and unsegmented data for offline training. We propose a novel approach called Mixture of Basis models (MoB) for addressing this problem setting. The core idea is …
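The abstract is truncated, so the details of MoB are not given here. As a rough illustration of the general mixture-of-basis idea the title suggests, the sketch below combines the outputs of several simple basis models via per-input softmax mixture weights. All names and the gating scheme are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K linear "basis" models over d-dimensional inputs,
# combined by per-input mixture weights. This is an illustrative sketch
# of a generic mixture-of-basis-models predictor, not the paper's method.
K, d = 3, 2
basis_params = rng.normal(size=(K, d))   # one linear basis model per row
gating_params = rng.normal(size=(K, d))  # gating scores per basis model

def mixture_weights(x):
    # Softmax over gating scores, shifted for numerical stability.
    scores = gating_params @ x
    w = np.exp(scores - scores.max())
    return w / w.sum()

def predict(x):
    # Prediction: convex combination of the basis models' outputs.
    w = mixture_weights(x)
    outputs = basis_params @ x
    return float(w @ outputs)

x = np.array([1.0, -0.5])
y = predict(x)
```

Because the mixture weights are a softmax, they are non-negative and sum to one, so the prediction stays a convex combination of the basis outputs; interpretability in such schemes typically comes from inspecting which basis model dominates for a given input.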