Jan. 1, 2023 | William J. Wilkinson, Simo Särkkä, Arno Solin

JMLR (www.jmlr.org)

We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterior linearisation (PL) as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution. This viewpoint explicitly casts inference algorithms under the framework of numerical optimisation. We show that common approximations to Newton's method from the optimisation literature, namely Gauss-Newton and quasi-Newton methods (e.g., the BFGS algorithm), are still valid under this 'Bayes-Newton' framework. This leads to a suite of novel algorithms which are guaranteed to …
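
To illustrate why Gauss-Newton-style approximations fit naturally into such a framework, here is a toy sketch (not the paper's Bayes-Newton algorithm) of a nonlinear least-squares problem: the Gauss-Newton Hessian approximation `Jᵀ J` is positive semi-definite by construction, whereas the exact Hessian need not be. The `residual` function is a hypothetical example problem chosen for illustration.

```python
import numpy as np

def residual(x):
    # Hypothetical residual: r(x) = [x0^2 - 1, x0 * x1]
    return np.array([x[0] ** 2 - 1.0, x[0] * x[1]])

def jacobian(x):
    # Jacobian of r evaluated at x
    return np.array([[2.0 * x[0], 0.0],
                     [x[1],       x[0]]])

def gauss_newton_step(x):
    """One Gauss-Newton step for f(x) = 0.5 * ||r(x)||^2.

    H = J^T J is PSD by construction, so the step is always
    well-defined (up to a small jitter for invertibility).
    """
    J, r = jacobian(x), residual(x)
    H = J.T @ J + 1e-8 * np.eye(len(x))  # PSD + jitter
    return x - np.linalg.solve(H, J.T @ r)

x = np.array([2.0, 1.0])
for _ in range(20):
    x = gauss_newton_step(x)
# x converges to a zero of the residual near (1, 0)
```

The same PSD property is what the Bayes-Newton viewpoint exploits: replacing the exact Hessian with a PSD surrogate keeps the resulting covariance updates valid.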
