The Role of Depth, Width, and Activation Complexity in the Number of Linear Regions of Neural Networks. (arXiv:2206.08615v1 [cs.LG])
Web: http://arxiv.org/abs/2206.08615
June 20, 2022, 1:12 a.m. | Alexis Goujon, Arian Etemadi, Michael Unser
stat.ML updates on arXiv.org
Many feedforward neural networks generate continuous and piecewise-linear
(CPWL) mappings. Specifically, they partition the input domain into regions on
which the mapping is an affine function. The number of these so-called linear
regions offers a natural metric to characterize the expressiveness of CPWL
mappings. Although the precise determination of this quantity is often out of
reach, bounds have been proposed for specific architectures, including the
well-known ReLU and Maxout networks. In this work, we propose a more general
perspective and …
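The abstract treats the number of linear regions as a measure of expressiveness. As a rough illustration that is not taken from the paper, the sketch below builds a small random ReLU network with NumPy and counts the distinct hidden-unit activation patterns it produces over a dense 1D grid, a common empirical proxy for the linear-region count on that interval; the layer widths, input range, and grid resolution are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(widths):
    """Random weights and biases for a fully connected ReLU network."""
    return [
        (rng.standard_normal((n_out, n_in)), rng.standard_normal(n_out))
        for n_in, n_out in zip(widths[:-1], widths[1:])
    ]

def activation_pattern(x, params):
    """On/off pattern of every hidden ReLU for a scalar input x."""
    h = np.array([x])
    pattern = []
    for W, b in params[:-1]:          # hidden layers only; last pair is the linear output layer
        pre = W @ h + b
        pattern.append(pre > 0)
        h = np.maximum(pre, 0.0)
    return tuple(np.concatenate(pattern))

# Scalar input/output with two hidden layers of width 8 (arbitrary choices).
params = random_relu_net([1, 8, 8, 1])
grid = np.linspace(-10.0, 10.0, 50_000)
patterns = {activation_pattern(x, params) for x in grid}
print(f"distinct activation patterns on [-10, 10] "
      f"(empirical proxy for the linear-region count): {len(patterns)}")
```

Because every activation pattern fixes each preactivation to an affine function of the input, the inputs sharing a pattern form a convex set on which the network is affine, so counting patterns seen on a fine grid gives a simple empirical handle on the quantity the paper bounds analytically.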