Laplace Matching for fast Approximate Inference in Latent Gaussian Models. (arXiv:2105.03109v2 [cs.LG] UPDATED)
Oct. 12, 2022, 1:12 a.m. | Marius Hobbhahn, Philipp Hennig
cs.LG updates on arXiv.org arxiv.org
Bayesian inference on non-Gaussian data is often non-analytic and requires
computationally expensive approximations such as sampling or variational
inference. We propose an approximate inference framework primarily designed to
be computationally cheap while still achieving high approximation quality. The
concept, which we call Laplace Matching, involves closed-form, approximate,
bi-directional transformations between the parameter spaces of exponential
families. These are constructed from Laplace approximations under
custom-designed basis transformations. The mappings can then be leveraged to
effectively turn a latent Gaussian distribution into …
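As a concrete illustration of the idea, the Gamma distribution admits a closed-form Laplace approximation in the log basis: for X ~ Gamma(α, β), the density of Y = log X has mode log(α/β) and negative second log-density derivative α at that mode, giving the Gaussian N(log(α/β), 1/α), which can be inverted in closed form. The sketch below shows only this one Gamma↔Gaussian pair under the log-basis transformation, not the paper's full framework; function names are illustrative.

```python
import math

def gamma_to_gaussian(alpha, beta):
    """Laplace approximation of Gamma(alpha, beta) in the log basis.

    For Y = log X with X ~ Gamma(alpha, beta), the log-density is
    alpha*y - beta*exp(y) + const. Setting the derivative to zero
    gives the mode y* = log(alpha/beta); the negative second
    derivative at the mode is alpha, so the Laplace approximation
    is N(log(alpha/beta), 1/alpha).
    """
    mu = math.log(alpha / beta)
    var = 1.0 / alpha
    return mu, var

def gaussian_to_gamma(mu, var):
    """Closed-form inverse mapping: Gaussian in log space back to Gamma."""
    alpha = 1.0 / var
    beta = alpha * math.exp(-mu)
    return alpha, beta
```

Because both directions are closed-form, a latent Gaussian belief over log X can be translated to a Gamma belief over X (and back) at negligible cost, which is the sense in which the abstract calls the approach computationally cheap.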