Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions. (arXiv:2104.12949v2 [stat.ML] UPDATED)
Web: http://arxiv.org/abs/2104.12949
June 23, 2022, 1:12 a.m. | Michael C. Burkhart
stat.ML updates on arXiv.org
To minimize the average of a set of log-convex functions, the stochastic Newton method iteratively updates its estimate using subsampled versions of the full objective's gradient and Hessian. We contextualize this optimization problem as sequential Bayesian inference on a latent state-space model with a discriminatively-specified observation process. Applying Bayesian filtering then yields a novel optimization algorithm that considers the entire history of gradients and Hessians when forming an update. We establish matrix-based conditions under which the effect of older observations …
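As a rough illustration of the baseline subsampled Newton step the abstract refers to (not the paper's filtering-based algorithm), the sketch below minimizes the average of synthetic log-convex functions f_i(theta) = exp(0.5 * (a_i' theta - b_i)^2). The data, minibatch size, and ridge term are assumptions made only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log-convex objectives (illustrative, not from the paper):
#   f_i(theta) = exp(0.5 * (a_i @ theta - b_i) ** 2)
# Each f_i is log-convex; we minimize their average with stochastic Newton steps.
n, d = 200, 3
A = rng.normal(size=(n, d))
theta_true = 0.3 * rng.normal(size=d)
b = A @ theta_true + 0.1 * rng.normal(size=n)

def subsampled_grad_hess(theta, idx):
    """Gradient and Hessian of the minibatch average of the f_i."""
    r = A[idx] @ theta - b[idx]            # residuals on the minibatch
    w = np.exp(0.5 * r ** 2)               # f_i(theta) for i in the minibatch
    g = A[idx].T @ (w * r) / len(idx)      # mean of f_i * r_i * a_i
    H = (A[idx].T * (w * (r ** 2 + 1))) @ A[idx] / len(idx)
    return g, H

theta = np.zeros(d)
for _ in range(50):
    idx = rng.choice(n, size=32, replace=False)
    g, H = subsampled_grad_hess(theta, idx)
    # One subsampled Newton step; a small ridge keeps the Hessian invertible.
    theta -= np.linalg.solve(H + 1e-8 * np.eye(d), g)

print("estimate:", theta, " reference:", theta_true)
```

Each such step uses only the current minibatch; the paper's contribution, per the abstract, is a filtering-style update that instead aggregates the entire history of gradients and Hessians.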