Separation of Scales and a Thermodynamic Description of Feature Learning in Some CNNs. (arXiv:2112.15383v3 [stat.ML] UPDATED)
Sept. 26, 2022, 1:12 a.m. | Inbar Seroussi, Gadi Naveh, Zohar Ringel
stat.ML updates on arXiv.org
Deep neural networks (DNNs) are powerful tools for compressing and distilling
information. Their scale and complexity, often involving billions of
inter-dependent parameters, render direct microscopic analysis difficult. Under
such circumstances, a common strategy is to identify slow variables that
average the erratic behavior of the fast microscopic variables. Here, we
identify a similar separation of scales occurring in fully trained finitely
over-parameterized deep convolutional neural networks (CNNs) and fully
connected networks (FCNs). Specifically, we show that DNN layers couple only …
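The "slow variable" strategy the abstract describes — summarizing many erratic microscopic quantities by an average whose fluctuations shrink with system size — can be illustrated with a minimal sketch. The numbers and distributions below are purely illustrative stand-ins, not the paper's actual DNN observables:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Fast" microscopic variables: many erratic per-parameter quantities
# (illustrative i.i.d. Gaussians, not the paper's actual observables).
n_micro = 10_000
fast = rng.normal(loc=1.0, scale=5.0, size=n_micro)

# "Slow" variable: their average. By the central limit theorem its
# fluctuation scale is ~ scale / sqrt(n_micro), far smaller than the
# spread of any single microscopic variable.
slow = fast.mean()

print(f"single-variable spread : {fast.std():.2f}")
print(f"slow-variable deviation: {abs(slow - 1.0):.4f}")
```

The slow variable concentrates near its mean even though each microscopic variable fluctuates wildly, which is what makes such coarse-grained descriptions tractable at scale.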