April 17, 2023, 8:02 p.m. | Louis Fortier-Dubois, Gaël Letarte, Benjamin Leblanc, François Laviolette, Pascal Germain

cs.LG updates on arXiv.org

Considering a probability distribution over parameters is known to be an
efficient strategy for learning a neural network with non-differentiable
activation functions. We study the expectation of a probabilistic neural
network as a predictor in its own right, focusing on the aggregation of binary
activated neural networks with normal distributions over real-valued weights.
Our work builds on a recent PAC-Bayesian analysis that yields tight
generalization bounds and learning procedures for the expected output value of
such an aggregation, which is …
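The key fact underlying such aggregations is that when the weights of a binary (sign) activated neuron follow an isotropic Gaussian, the expectation of its output has a closed form: the pre-activation w·x is Gaussian with mean μ·x and standard deviation σ‖x‖, so E[sign(w·x)] = erf(μ·x / (√2 σ‖x‖)). The sketch below (not from the paper; function names are illustrative) checks this closed form against a Monte Carlo estimate:

```python
import numpy as np
from math import erf, sqrt

def expected_sign_activation(x, mu, sigma):
    """Closed-form E[sign(w.x)] for w ~ N(mu, sigma^2 I).

    The pre-activation w.x is Gaussian with mean mu.x and standard
    deviation sigma * ||x||, hence the expectation of its sign is
    erf(mu.x / (sqrt(2) * sigma * ||x||)).
    """
    m = float(np.dot(mu, x))
    s = sigma * float(np.linalg.norm(x))
    return erf(m / (sqrt(2.0) * s))

def monte_carlo_sign_activation(x, mu, sigma, n=200_000, seed=0):
    """Monte Carlo estimate of the same expectation, for comparison."""
    rng = np.random.default_rng(seed)
    w = rng.normal(mu, sigma, size=(n, len(mu)))
    return float(np.mean(np.sign(w @ x)))

# Arbitrary example input and Gaussian parameters (illustrative values).
x = np.array([1.0, -2.0, 0.5])
mu = np.array([0.3, 0.1, -0.4])
sigma = 0.8

exact = expected_sign_activation(x, mu, sigma)
approx = monte_carlo_sign_activation(x, mu, sigma)
print(exact, approx)
```

Because the expectation is a smooth (erf) function of μ, it can be differentiated and trained by gradient descent even though sign itself has zero gradient almost everywhere, which is the point of aggregating over a distribution on the weights.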

