On Enforcing Better Conditioned Meta-Learning for Rapid Few-Shot Adaptation. (arXiv:2206.07260v1 [cs.LG])
Web: http://arxiv.org/abs/2206.07260
June 16, 2022, 1:10 a.m. | Markus Hiller, Mehrtash Harandi, Tom Drummond
cs.LG updates on arXiv.org
Inspired by the concept of preconditioning, we propose a novel method to
increase adaptation speed for gradient-based meta-learning methods without
incurring extra parameters. We demonstrate that recasting the optimization
problem to a non-linear least-squares formulation provides a principled way to
actively enforce a well-conditioned parameter space for meta-learning models,
based on the concepts of the condition number and local curvature. Our
comprehensive evaluations show that the proposed method significantly
outperforms its unconstrained counterpart, especially during initial
adaptation steps, while achieving …
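The speed-up the abstract describes rests on the spectral condition number kappa(H) = sigma_max / sigma_min: gradient descent on an ill-conditioned loss surface needs far more steps than on a well-conditioned one. Below is a minimal NumPy sketch of that effect on a toy quadratic loss. It is not the authors' method; the function names and the toy Hessians are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): why conditioning controls
# how many gradient steps adaptation takes. Toy quadratic loss
# L(w) = 0.5 * w^T H w, whose gradient is H @ w.
import numpy as np

def condition_number(H, eps=1e-12):
    """Spectral condition number kappa(H) = sigma_max / sigma_min."""
    s = np.linalg.svd(H, compute_uv=False)  # singular values, descending
    return s[0] / max(s[-1], eps)

def gd_steps_to_converge(H, lr, tol=1e-6, max_steps=10_000):
    """Count plain gradient-descent steps until the gradient norm is tiny."""
    w = np.ones(H.shape[0])
    for t in range(max_steps):
        if np.linalg.norm(H @ w) < tol:
            return t
        w -= lr * (H @ w)
    return max_steps

H_bad = np.diag([100.0, 1.0])   # kappa = 100: ill-conditioned
H_good = np.diag([1.0, 1.0])    # kappa = 1: well-conditioned

for H in (H_bad, H_good):
    lr = 1.0 / np.linalg.svd(H, compute_uv=False)[0]  # stable step size 1/sigma_max
    print(f"kappa={condition_number(H):.0f}: {gd_steps_to_converge(H, lr)} steps")
```

On the ill-conditioned quadratic this takes on the order of a thousand steps, versus a single step when kappa = 1; that gap during the first few updates is what the paper's conditioning constraint aims to close.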
Latest AI/ML/Big Data Jobs
Machine Learning Researcher - Saalfeld Lab
@ Howard Hughes Medical Institute - Chevy Chase, MD | Ashburn, Virginia
Project Director, Machine Learning in US Health
@ ideas42.org | Remote, US
Data Science Intern
@ NannyML | Remote
Machine Learning Engineer NLP/Speech
@ Play.ht | Remote
Research Scientist, 3D Reconstruction
@ Yembo | Remote, US
Clinical Assistant or Associate Professor of Management Science and Systems
@ University at Buffalo | Buffalo, NY