March 12, 2024, 4:45 a.m. | Aakash Lahoti, Spandan Senapati, Ketan Rajawat, Alec Koppel

cs.LG updates on arXiv.org arxiv.org

arXiv:2305.17283v2 Announce Type: replace-cross
Abstract: The problem of minimizing the sum of $n$ functions in $d$ dimensions is ubiquitous in machine learning and statistics. In many applications where the number of observations $n$ is large, it is necessary to use incremental or stochastic methods, as their per-iteration cost is independent of $n$. Of these, Quasi-Newton (QN) methods strike a balance between the per-iteration cost and the convergence rate. Specifically, they exhibit a superlinear rate with $O(d^2)$ cost in contrast to …

