Web: http://arxiv.org/abs/2205.01940

May 5, 2022, 1:12 a.m. | Jie Ren, Mingjie Li, Meng Zhou, Shih-Han Chan, Quanshi Zhang

cs.LG updates on arXiv.org

This paper aims to theoretically analyze the complexity of feature
transformations encoded in DNNs with ReLU layers. We propose metrics to measure
three types of transformation complexity based on information theory.
We further discover and prove a strong correlation between the complexity and
the disentanglement of transformations. Based on the proposed metrics, we
analyze two typical phenomena of how transformation complexity changes
during training, and explore the ceiling of a DNN's complexity. The
proposed …
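As a rough illustration of how an information-theoretic complexity measure for ReLU transformations could be instantiated, the sketch below estimates the entropy of the empirical distribution of binary ReLU gating states over a batch of probe inputs. This is an assumed approximation for intuition only, not the paper's exact metric; the architecture, function names, and probe data are hypothetical.

```python
# Hypothetical sketch: entropy of ReLU gating patterns as a rough proxy for
# the complexity of the transformation a ReLU network encodes.
# Not the metric defined in the paper; purely illustrative.
import math
from collections import Counter

import torch
import torch.nn as nn

torch.manual_seed(0)

# A small ReLU MLP (architecture chosen arbitrarily for this sketch).
model = nn.Sequential(
    nn.Linear(10, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

def gating_entropy(model, x):
    """Entropy (in bits) of the empirical distribution of binary ReLU
    gating patterns observed on the batch x, one value per ReLU layer."""
    entropies = []
    h = x
    for layer in model:
        h = layer(h)
        if isinstance(layer, nn.ReLU):
            # Binary gating state: 1 where a unit fires, 0 where it is clipped.
            states = (h > 0).to(torch.int8)
            counts = Counter(tuple(row.tolist()) for row in states)
            n = states.shape[0]
            ent = -sum((c / n) * math.log2(c / n) for c in counts.values())
            entropies.append(ent)
    return entropies

x = torch.randn(1024, 10)          # random probe inputs (hypothetical)
print(gating_entropy(model, x))    # one entropy estimate per ReLU layer
```

A higher entropy here means the network partitions the probe inputs into more distinct linear regions, which is one intuitive way to think about a "more complex" ReLU transformation; the paper's own metrics and their relation to disentanglement are developed formally rather than via this empirical count.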

