Web: http://arxiv.org/abs/2206.10745

June 24, 2022, 1:11 a.m. | Thomas O'Leary-Roseberry, Peng Chen, Umberto Villa, Omar Ghattas

cs.LG updates on arXiv.org

Neural operators have gained significant attention recently due to their
ability to approximate high-dimensional parametric maps between function
spaces. At present, only parametric function approximation has been addressed
in the neural operator literature. In this work, we investigate incorporating
parametric derivative information in neural operator training; this information
can improve function approximations, and it can also be used to improve the
approximation of the derivative with respect to the parameter, which is often
key to the scalable solution of high-dimensional outer-loop …
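The derivative-informed training idea described above can be sketched as a loss that penalizes both the output misfit and the misfit of the parametric Jacobian (a form of Sobolev training). Below is a minimal, hypothetical illustration in JAX: the `surrogate` MLP, its sizes, and the data are all toy assumptions standing in for a real neural operator and its training samples, not the authors' actual architecture.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy surrogate: a one-hidden-layer MLP mapping a
# parameter vector m (dim 3) to an output vector u (dim 2),
# standing in for a reduced-basis neural operator.
def surrogate(params, m):
    W1, b1, W2, b2 = params
    return W2 @ jnp.tanh(W1 @ m + b1) + b2

def derivative_informed_loss(params, m, u_true, J_true):
    # Output misfit plus Frobenius misfit on the parametric Jacobian;
    # jax.jacrev gives the surrogate's derivative w.r.t. m.
    u_pred = surrogate(params, m)
    J_pred = jax.jacrev(surrogate, argnums=1)(params, m)
    return jnp.sum((u_pred - u_true) ** 2) + jnp.sum((J_pred - J_true) ** 2)

# Toy data: in practice (m, u, J) samples would come from a PDE solver
# and its adjoint/tangent computations.
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (jax.random.normal(k1, (4, 3)), jnp.zeros(4),
          jax.random.normal(k2, (2, 4)), jnp.zeros(2))
m = jnp.ones(3)
u_true = jnp.zeros(2)
J_true = jnp.zeros((2, 3))

loss, grads = jax.value_and_grad(derivative_informed_loss)(
    params, m, u_true, J_true)
```

Because the loss itself contains a derivative of the network, training differentiates through `jax.jacrev`, which JAX supports by composing reverse-mode transforms.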

