Web: http://arxiv.org/abs/2112.10510

Jan. 26, 2022, 2:11 a.m. | Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter

cs.LG updates on arXiv.org

Currently, it is hard to reap the benefits of deep learning for Bayesian
methods, which allow the explicit specification of prior knowledge and
accurately capture model uncertainty. We present Prior-Data Fitted Networks
(PFNs). PFNs leverage large-scale machine learning techniques to approximate a
large set of posteriors. The only requirement for PFNs to work is the ability
to sample from a prior distribution over supervised learning tasks (or
functions). Our method restates the objective of posterior approximation as a
supervised classification …
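The abstract notes that the only requirement for PFNs is the ability to sample from a prior distribution over supervised learning tasks. A minimal sketch of that prior-sampling step, assuming a simple Bayesian linear-regression prior as a hypothetical stand-in for the priors a PFN would actually be meta-trained on (all names here are illustrative, not from the paper's code):

```python
import numpy as np

def sample_task(rng, n_points=10, dim=2, noise=0.1):
    """Sample one supervised learning task (a small dataset) from a
    toy Bayesian linear-regression prior: latent weights w ~ N(0, I),
    inputs x ~ N(0, I), labels y = x @ w + Gaussian noise."""
    w = rng.standard_normal(dim)
    X = rng.standard_normal((n_points, dim))
    y = X @ w + noise * rng.standard_normal(n_points)
    return X, y

def sample_prior_data(rng, n_tasks=1000, **kw):
    """Draw a stream of tasks; a PFN would be trained on such a stream,
    learning to predict held-out labels from in-context examples."""
    return [sample_task(rng, **kw) for _ in range(n_tasks)]

rng = np.random.default_rng(0)
tasks = sample_prior_data(rng, n_tasks=3, n_points=5, dim=2)
for X, y in tasks:
    print(X.shape, y.shape)  # (5, 2) (5,)
```

In the full method, each sampled task would be split into context and query points, and a transformer would be trained to classify the query labels given the context, which is how posterior approximation is restated as supervised classification.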

arxiv bayesian bayesian inference transformers
