The Asymmetric Maximum Margin Bias of Quasi-Homogeneous Neural Networks. (arXiv:2210.03820v1 [cs.LG])
Oct. 11, 2022, 1:15 a.m. | Daniel Kunin, Atsushi Yamamura, Chao Ma, Surya Ganguli
stat.ML updates on arXiv.org arxiv.org
In this work, we explore the maximum-margin bias of quasi-homogeneous neural
networks trained with gradient flow on an exponential loss and past a point of
separability. We introduce the class of quasi-homogeneous models, which is
expressive enough to describe nearly all neural networks with homogeneous
activations, even those with biases, residual connections, and normalization
layers, while structured enough to enable geometric analysis of its gradient
dynamics. Using this analysis, we generalize the existing results of
maximum-margin bias for homogeneous networks …
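The homogeneity property that this line of analysis builds on can be illustrated with a minimal sketch (not from the paper): a bias-free two-layer ReLU network is 2-homogeneous in its parameters, i.e. scaling every weight by α > 0 scales the output by α². Adding bias terms breaks this exact homogeneity, which is the kind of architecture the quasi-homogeneous class is meant to cover.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def net(W1, W2, x):
    # Bias-free two-layer ReLU network: f(W1, W2; x) = W2 relu(W1 x).
    return W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(1, 8))
x = rng.normal(size=4)
alpha = 3.0

# relu is positively homogeneous, so scaling both weight matrices by
# alpha scales the output by alpha**2 (degree-2 homogeneity).
lhs = net(alpha * W1, alpha * W2, x)
rhs = alpha**2 * net(W1, W2, x)
assert np.allclose(lhs, rhs)

# With a bias b added inside relu(W1 @ x + b), this identity fails
# unless b is also rescaled appropriately -- the case the
# quasi-homogeneous framework is designed to handle.
```

The assertion passes for any α > 0; the degree of homogeneity grows with depth for deeper bias-free networks.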