Residual Multi-Fidelity Neural Network Computing
March 7, 2024, 5:43 a.m. | Owen Davis, Mohammad Motamed, Raul Tempone
cs.LG updates on arXiv.org (arxiv.org)
Abstract: In this work, we consider the general problem of constructing a neural network surrogate model using multi-fidelity information. Motivated by rigorous error and complexity estimates for ReLU neural networks, given an inexpensive low-fidelity and an expensive high-fidelity computational model, we present a residual multi-fidelity computational framework that formulates the correlation between models as a residual function, a possibly non-linear mapping between 1) the shared input space of the models together with the low-fidelity model output …
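Below is a minimal sketch of the residual multi-fidelity idea as described in the abstract, written in PyTorch; it is not the authors' implementation. The assumptions: `f_lo` and `f_hi` are hypothetical low- and high-fidelity models, a small ReLU network `res_net` learns the residual mapping from the shared input together with the low-fidelity output to the discrepancy between the two models, and the resulting surrogate adds that learned correction to the cheap low-fidelity prediction.

```python
# Sketch of a residual multi-fidelity surrogate (illustrative, not the paper's code).
import torch
import torch.nn as nn

def f_lo(x):                      # hypothetical cheap low-fidelity model
    return torch.sin(x)

def f_hi(x):                      # hypothetical expensive high-fidelity model
    return torch.sin(x) + 0.1 * torch.sin(10 * x)

# Residual network: input is (x, f_lo(x)), output approximates f_hi(x) - f_lo(x).
res_net = nn.Sequential(
    nn.Linear(2, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

# A modest set of high-fidelity samples is used to train the residual.
x_train = torch.linspace(0.0, 2.0, 40).unsqueeze(1)
lo = f_lo(x_train)
target = f_hi(x_train) - lo       # residual the network must learn

opt = torch.optim.Adam(res_net.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    pred = res_net(torch.cat([x_train, lo], dim=1))
    loss = nn.functional.mse_loss(pred, target)
    loss.backward()
    opt.step()

# Multi-fidelity surrogate: low-fidelity model plus learned correction.
def surrogate(x):
    lo_x = f_lo(x)
    return lo_x + res_net(torch.cat([x, lo_x], dim=1))
```

The design choice reflected here is that the residual is typically smoother and smaller in magnitude than the high-fidelity response itself, so it can be learned from far fewer expensive samples than a direct high-fidelity surrogate would require.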