April 17, 2023, 8:05 p.m. | Javier Burroni, Justin Domke, Daniel Sheldon

stat.ML updates on arXiv.org

We present a novel approach to black-box variational inference (VI) that bypasses the difficulties
of stochastic gradient ascent, including the task of selecting step sizes. Our
approach uses a sequence of sample average approximation (SAA)
problems. SAA approximates the solution of a stochastic optimization problem by
transforming it into a deterministic one. We use quasi-Newton methods and line
search to solve each deterministic optimization problem, and we present a heuristic
policy to automate hyperparameter selection. Our experiments show that our
method simplifies the VI problem …
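The core idea is easy to sketch: fix a set of base samples once, so the Monte Carlo estimate of the negative ELBO becomes a deterministic function of the variational parameters, then hand that function to an off-the-shelf quasi-Newton optimizer with line search. The sketch below is a minimal illustration under assumptions not taken from the abstract: a toy standard-normal log_joint target, a mean-field Gaussian variational family, a single fixed sample size rather than the paper's sequence of SAA problems, and SciPy's L-BFGS-B as the quasi-Newton solver.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical target: unnormalized log density of the model, log p(x).
def log_joint(x):
    return -0.5 * np.sum(x ** 2)  # toy standard-normal target

d = 2                                       # latent dimension
n_samples = 64                              # fixed Monte Carlo sample size
rng = np.random.default_rng(0)
eps = rng.standard_normal((n_samples, d))   # base samples fixed once -> deterministic objective

def negative_elbo(params):
    """SAA estimate of the negative ELBO for a mean-field Gaussian q
    with parameters (mu, log_sigma), evaluated on the fixed samples."""
    mu, log_sigma = params[:d], params[d:]
    sigma = np.exp(log_sigma)
    x = mu + sigma * eps                    # reparameterized samples (same eps every call)
    entropy = np.sum(log_sigma) + 0.5 * d * (1.0 + np.log(2.0 * np.pi))
    expected_log_joint = np.mean([log_joint(xi) for xi in x])
    return -(expected_log_joint + entropy)

# Solve the deterministic problem with a quasi-Newton method (L-BFGS-B);
# the built-in line search removes the need to choose a step size.
init = np.zeros(2 * d)
result = minimize(negative_elbo, init, method="L-BFGS-B")
mu_hat, log_sigma_hat = result.x[:d], result.x[d:]
print("posterior mean estimate:", mu_hat)
print("posterior std estimate:", np.exp(log_sigma_hat))
```

Because eps never changes between calls, the objective is deterministic, and the optimizer's line search handles step sizes automatically; the paper's method goes further by re-solving a sequence of such SAA problems with hyperparameters chosen by its heuristic policy.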

