April 23, 2024, 4:44 a.m. | Guohao Shen, Yuling Jiao, Yuanyuan Lin, Jian Huang

cs.LG updates on arXiv.org

arXiv:2305.00608v3 Announce Type: replace-cross
Abstract: We study the properties of differentiable neural networks activated by rectified power unit (RePU) functions. We show that the partial derivatives of RePU neural networks can be represented by networks with mixed RePU activations, and we derive upper bounds on the complexity of the function class of derivatives of RePU networks. We establish error bounds for simultaneously approximating $C^s$ smooth functions and their derivatives using RePU-activated deep neural networks. Furthermore, we derive improved approximation error bounds when data …
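As background for the abstract, here is a minimal PyTorch sketch of a RePU activation and a RePU-activated network, assuming the standard definition $\sigma_p(x) = \max(0, x)^p$; the class name `RePU`, the choice $p = 2$, and the layer widths are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn


class RePU(nn.Module):
    """Rectified power unit: sigma_p(x) = max(0, x)**p.

    For p = 1 this reduces to ReLU; for p >= 2 the activation is
    (p - 1)-times continuously differentiable, which is what permits
    simultaneous approximation of a smooth function and its derivatives.
    """

    def __init__(self, p: int = 2):
        super().__init__()
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.clamp(x, min=0.0) ** self.p


# A small RePU-activated network (architecture chosen only for illustration).
net = nn.Sequential(nn.Linear(3, 16), RePU(p=2), nn.Linear(16, 1))

x = torch.randn(8, 3, requires_grad=True)
y = net(x).sum()
y.backward()  # gradients are well defined because sigma_2 is C^1
print(x.grad.shape)  # torch.Size([8, 3])
```

Since $\sigma_p'(x) = p\,\sigma_{p-1}(x)$, differentiating a RePU network yields terms built from lower-order RePUs, which is consistent with the abstract's claim that partial derivatives of RePU networks can be represented by networks with mixed RePU activations.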

