Gradient-free online learning of subgrid-scale dynamics with neural emulators. (arXiv:2310.19385v2 [physics.comp-ph] UPDATED)
cs.LG updates on arXiv.org
In this paper, we propose a generic algorithm to train machine-learning-based
subgrid parametrizations online, i.e., with $\textit{a posteriori}$ loss
functions, for non-differentiable numerical solvers. The proposed approach
leverages neural emulators to train an approximation of the reduced state-space
solver, which is then used to allow gradient propagation through temporal
integration steps. The algorithm recovers most of the benefit of
online strategies without having to compute the gradient of the original
solver. It is demonstrated that training …
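The two-stage idea described in the abstract can be sketched as follows. This is a hypothetical toy illustration, not the paper's implementation: `true_solver_step` is an invented stand-in for a non-differentiable solver, and the network sizes, rollout length, and loss are arbitrary assumptions. Stage 1 fits a differentiable neural emulator to the solver's one-step map; stage 2 trains the subgrid parametrization with an a posteriori (multi-step) loss, backpropagating through the emulator's rollout instead of the solver.

```python
# Toy sketch (assumed setup, not the paper's code): online training of a
# subgrid parametrization through a neural emulator of a non-differentiable
# solver.
import torch
import torch.nn as nn

torch.manual_seed(0)

def true_solver_step(x):
    # Stand-in for a non-differentiable numerical solver step:
    # gradients are explicitly blocked, as with an external Fortran/C solver.
    with torch.no_grad():
        return x + 0.1 * (torch.roll(x, 1, dims=-1) - x)

emulator = nn.Sequential(nn.Linear(8, 32), nn.Tanh(), nn.Linear(32, 8))
parametrization = nn.Sequential(nn.Linear(8, 32), nn.Tanh(), nn.Linear(32, 8))
opt_em = torch.optim.Adam(emulator.parameters(), lr=1e-2)
opt_pa = torch.optim.Adam(parametrization.parameters(), lr=1e-2)

# Stage 1: fit the emulator to the solver's one-step map (supervised).
for _ in range(200):
    x = torch.randn(16, 8)
    loss_em = ((emulator(x) - true_solver_step(x)) ** 2).mean()
    opt_em.zero_grad(); loss_em.backward(); opt_em.step()

# Stage 2: online (a posteriori) training of the parametrization.
# Roll out several steps through the *differentiable* emulator so that
# gradients propagate through temporal integration.
reference = torch.randn(16, 8)  # assumed reference trajectory endpoint
for _ in range(50):
    x = torch.randn(16, 8)
    for _ in range(4):  # multi-step rollout
        x = emulator(x) + parametrization(x)
    loss_pa = ((x - reference) ** 2).mean()
    opt_pa.zero_grad(); loss_pa.backward(); opt_pa.step()
```

The key design point is that the emulator, not the original solver, sits inside the rollout, so `loss_pa.backward()` succeeds even though `true_solver_step` provides no gradients.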