June 9, 2023, 3:59 p.m. | /u/ilrazziatore

Machine Learning www.reddit.com

It is known that MC-dropout provides a cheap and fast way to approximate the posterior distribution. The quality of this approximation has been criticized by different authors (Osband [http://bayesiandeeplearning.org/2016/papers/BDL\_4.pdf](http://bayesiandeeplearning.org/2016/papers/BDL_4.pdf), Le Folgoc [https://arxiv.org/pdf/2110.04286.pdf](https://arxiv.org/pdf/2110.04286.pdf), [https://arxiv.org/pdf/2008.02627.pdf](https://arxiv.org/pdf/2008.02627.pdf)).

The idea is that: 1) approximating the posterior with a bunch of deltas is not enough to extract meaningful information from the posterior itself; 2) the uncertainty does not concentrate with observed data, so that in the limit of infinite data it goes to …
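For readers unfamiliar with the mechanics being criticized: MC-dropout keeps dropout active at test time and treats T stochastic forward passes as samples from an approximate posterior predictive, whose mean and spread give the prediction and its uncertainty. Below is a minimal NumPy sketch of that procedure on a toy one-hidden-layer network; the weights, dropout rate, and sample count are all illustrative assumptions, not taken from the post or the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed weights for a 1-hidden-layer network (hypothetical values,
# standing in for a trained model).
W1 = rng.normal(scale=0.5, size=(1, 50))
b1 = np.zeros(50)
W2 = rng.normal(scale=0.5, size=(50, 1))
b2 = np.zeros(1)

def stochastic_forward(x, p_drop=0.5):
    """One forward pass with dropout left ON (the key trick in MC-dropout)."""
    h = np.maximum(x @ W1 + b1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop       # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)             # inverted-dropout rescaling
    return h @ W2 + b2

def mc_dropout_predict(x, T=200):
    """Approximate the posterior predictive with T dropout samples:
    the sample mean is the prediction, the sample std the uncertainty."""
    samples = np.stack([stochastic_forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = np.array([[0.3]])
mean, std = mc_dropout_predict(x)
```

Note that each sample differs only by which units are masked, i.e. the approximate posterior is a mixture of point masses over thinned sub-networks — exactly the "bunch of deltas" objection raised above — and the predictive std is driven by the fixed dropout rate rather than by how much data the model has seen.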

