March 7, 2024, 5:42 a.m. | Rui Wang, Yuesheng Xu, Mingsong Yan

cs.LG updates on arXiv.org arxiv.org

arXiv:2403.03353v1 Announce Type: cross
Abstract: This paper introduces a hypothesis space for deep learning that employs deep neural networks (DNNs). By treating a DNN as a function of two variables, the physical variable and the parameter variable, we consider the primitive set of DNNs whose parameter variable ranges over a set of weight matrices and biases determined by a prescribed depth and widths of the DNNs. We then complete the linear span of the primitive DNN set in …
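
A minimal sketch of the construction described in the abstract, not the authors' code: a DNN is viewed as a function N(x, theta) of the physical variable x and the parameter variable theta, the primitive set collects N(., theta) for theta ranging over weight matrices and biases of a prescribed depth and widths, and a hypothesis-space element is a finite linear combination of primitives. The architecture, activation, and random parameter sampling below are illustrative assumptions only.

    import numpy as np

    def dnn(x, weights, biases, activation=np.tanh):
        """Evaluate a fully connected DNN N(x, theta) with theta = (weights, biases)."""
        h = x
        for W, b in zip(weights[:-1], biases[:-1]):
            h = activation(W @ h + b)
        return weights[-1] @ h + biases[-1]  # linear output layer

    def random_theta(widths, rng):
        """Draw one parameter point theta for the prescribed depth and widths (hypothetical sampling)."""
        weights = [rng.standard_normal((m, n)) for n, m in zip(widths[:-1], widths[1:])]
        biases = [rng.standard_normal(m) for m in widths[1:]]
        return weights, biases

    # Prescribed architecture: input dimension 2, two hidden layers of width 8, scalar output.
    widths = [2, 8, 8, 1]
    rng = np.random.default_rng(0)

    # A finite sample from the primitive DNN set.
    primitives = [random_theta(widths, rng) for _ in range(5)]

    # An element of the linear span: f(x) = sum_i c_i * N(x, theta_i).
    coeffs = rng.standard_normal(len(primitives))
    x = np.array([0.3, -1.2])
    f_x = sum(c * dnn(x, W, b) for c, (W, b) in zip(coeffs, primitives))
    print(f_x.item())

The paper's hypothesis space is the completion of the span of such functions, so the snippet only illustrates the finite linear combinations that generate it.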

Tags: arXiv, cs.LG, math.FA, stat.ML, deep learning, neural networks, hypothesis space
