March 7, 2024, 5:43 a.m. | Owen Davis, Mohammad Motamed, Raul Tempone

cs.LG updates on arXiv.org

arXiv:2310.03572v2 Announce Type: replace
Abstract: In this work, we consider the general problem of constructing a neural network surrogate model using multi-fidelity information. Motivated by rigorous error and complexity estimates for ReLU neural networks, given an inexpensive low-fidelity and an expensive high-fidelity computational model, we present a residual multi-fidelity computational framework that formulates the correlation between models as a residual function, a possibly non-linear mapping between 1) the shared input space of the models together with the low-fidelity model output …