May 26, 2022, 1:11 a.m. | Pierre Wolinski, Julyan Arbel

stat.ML updates on arXiv.org arxiv.org

The goal of the present work is to propose a way to modify both the
initialization distribution of the weights of a neural network and its
activation function, such that all pre-activations are Gaussian. We propose a
family of initialization/activation pairs, in which the activation functions
span a continuum from bounded functions (such as Heaviside or tanh) to the
identity function.
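
As a rough illustration of the claim (not the paper's construction), the sketch below empirically checks whether deep-layer pre-activations look Gaussian for a given initialization/activation pair. The layer widths, the variance-preserving scaling, and the normality test are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: propagate Gaussian inputs through a random MLP and test
# whether the last layer's pre-activations still look Gaussian.
# All hyperparameters here are illustrative choices, not the paper's.
import numpy as np
from scipy import stats

def last_preactivations(activation, depth=20, width=256, n_samples=2000, seed=0):
    """Return the pre-activations of the last layer of a random MLP."""
    rng = np.random.default_rng(seed)
    h = rng.standard_normal((n_samples, width))  # Gaussian inputs
    z = h
    for _ in range(depth):
        # Variance-preserving Gaussian initialization (assumed, for illustration).
        W = rng.standard_normal((width, width)) / np.sqrt(width)
        z = h @ W            # pre-activations
        h = activation(z)    # post-activations
    return z

for name, act in [("tanh", np.tanh), ("identity", lambda x: x)]:
    z = last_preactivations(act)
    sample = z[:, 0]
    # Kolmogorov-Smirnov test against a standard normal after standardizing.
    stat, p = stats.kstest((sample - sample.mean()) / sample.std(), "norm")
    print(f"{name:8s}  KS statistic = {stat:.3f}, p-value = {p:.3f}")
```

Comparing the test statistics across activation choices gives a quick, informal sense of how far pre-activations drift from Gaussianity with depth, which is the kind of discrepancy the abstract's motivation alludes to.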


This work is motivated by the contradiction between existing works dealing
with Gaussian pre-activations: on one side, the works in …
