Imposing Gaussian Pre-Activations in a Neural Network. (arXiv:2205.12379v1 [cs.LG])
May 26, 2022, 1:11 a.m. | Pierre Wolinski, Julyan Arbel
stat.ML updates on arXiv.org
The goal of the present work is to propose a way to modify both the
initialization distribution of the weights of a neural network and its
activation function, such that all pre-activations are Gaussian. We propose a
family of initialization/activation pairs in which the activation functions
span a continuum from bounded functions (such as Heaviside or tanh) to the
identity function.
This work is motivated by the contradiction between existing works dealing
with Gaussian pre-activations: on one side, the works in …
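A hedged, minimal sketch of the kind of empirical check this question invites, not the paper's construction: propagate Gaussian inputs through a randomly initialized MLP and test each layer's pre-activations for normality. The interpolated activation f_alpha(x) = (1 - alpha) * tanh(x) + alpha * x is a hypothetical stand-in for the bounded-to-identity continuum the abstract describes; NumPy and SciPy are assumed.

```python
import numpy as np
from scipy import stats

def preactivation_pvalues(depth, width, alpha, n_samples=10_000, seed=0):
    """Return D'Agostino normality-test p-values for the first
    pre-activation coordinate at every layer of a random MLP."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, width))
    pvalues = []
    for _ in range(depth):
        # i.i.d. Gaussian weights with variance 1/width keep the
        # pre-activation scale O(1) at initialization.
        w = rng.standard_normal((width, width)) / np.sqrt(width)
        z = x @ w                                   # pre-activations
        pvalues.append(stats.normaltest(z[:, 0]).pvalue)
        # Hypothetical interpolated activation, not the paper's family:
        x = (1 - alpha) * np.tanh(z) + alpha * z
    return pvalues

for alpha in (0.0, 0.5, 1.0):  # tanh -> halfway -> identity
    p = preactivation_pvalues(depth=10, width=16, alpha=alpha)
    # Small p-values flag departures from Gaussianity. With the identity
    # (alpha=1) every layer is linear, so pre-activations stay exactly
    # Gaussian; bounded activations can only keep this approximately.
    print(f"alpha={alpha:.1f}  p-values by layer: "
          + " ".join(f"{v:.2f}" for v in p))
```

A narrow width (16) is used deliberately: with wide layers the central limit theorem pushes every pre-activation toward Gaussian regardless of the activation, which would hide exactly the approximation error the paper is concerned with.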