Feb. 7, 2024, 5:42 a.m. | J. Jon Ryu Xiangxiang Xu H. S. Melihcan Erol Yuheng Bu Lizhong Zheng Gregory W. Wornell

cs.LG updates on arXiv.org

Computing the eigenvalue decomposition (EVD) of a given linear operator, or finding its leading eigenvalues and eigenfunctions, is a fundamental task in many machine learning and scientific computing problems. For high-dimensional eigenvalue problems, training neural networks to parameterize the eigenfunctions is considered a promising alternative to classical numerical linear algebra techniques. This paper proposes a new optimization framework based on the low-rank approximation characterization of a truncated singular value decomposition, accompanied by new techniques called nesting for learning the …
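The low-rank approximation characterization mentioned in the abstract rests on the Eckart–Young theorem: minimizing the Frobenius error ||A − UVᵀ||² over unconstrained rank-k factors recovers the top-k singular subspaces. The paper parameterizes the factors with neural networks; the sketch below illustrates only the underlying principle on a plain matrix, using gradient descent on the same objective (all dimensions, step size, and iteration count here are illustrative assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 20, 15, 3
A = rng.standard_normal((m, n))

# Unconstrained rank-k factors, trained by gradient descent
# on the low-rank approximation objective ||A - U V^T||_F^2.
U = 0.1 * rng.standard_normal((m, k))
V = 0.1 * rng.standard_normal((n, k))
lr = 0.01
for _ in range(20000):
    R = A - U @ V.T                      # residual
    gU, gV = -2 * R @ V, -2 * R.T @ U    # gradients of the objective
    U, V = U - lr * gU, V - lr * gV

# Exact rank-k truncation from numpy's SVD, for comparison.
Uk, sk, Vkt = np.linalg.svd(A, full_matrices=False)
A_k = (Uk[:, :k] * sk[:k]) @ Vkt[:k]
err = np.linalg.norm(U @ V.T - A_k)
```

Although the objective is nonconvex, its global minimizers coincide with the truncated SVD, so the learned product UVᵀ should approach A_k; the "nesting" techniques the paper introduces address recovering the individual ordered singular functions rather than just the subspace.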

