Explaining the physics of transfer learning a data-driven subgrid-scale closure to a different turbulent flow. (arXiv:2206.03198v1 [physics.flu-dyn])
cs.LG updates on arXiv.org
Transfer learning (TL) is becoming a powerful tool in scientific applications
of neural networks (NNs), such as weather/climate prediction and turbulence
modeling. TL enables out-of-distribution generalization (e.g., extrapolation in
parameters) and effective blending of disparate training sets (e.g.,
simulations and observations). In TL, selected layers of a NN, already trained
for a base system, are re-trained using a small dataset from a target system.
For effective TL, we need to know: 1) which are the best layers to re-train? and …
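The re-training procedure the abstract describes — keeping most layers of a base-system network fixed and updating only selected layers on a small target-system dataset — can be sketched in a few lines. This is a minimal NumPy illustration with hypothetical toy data and a two-layer network, not the authors' actual setup: the first layer's weights are frozen and only the output layer is re-trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these weights come from a network already trained on the "base" system.
W1 = rng.normal(size=(4, 8))   # hidden layer: frozen during transfer learning
W2 = rng.normal(size=(8, 1))   # output layer: selected for re-training

# Small hypothetical dataset from the "target" system.
X = rng.normal(size=(16, 4))
y = np.tanh(X @ rng.normal(size=(4, 1)))

def forward(X, W1, W2):
    h = np.tanh(X @ W1)        # features from the frozen layer
    return h @ W2, h

pred, h = forward(X, W1, W2)
initial_loss = float(np.mean((pred - y) ** 2))

# Re-train only W2 on the small target dataset; W1 never changes.
lr = 0.1
for _ in range(200):
    pred, h = forward(X, W1, W2)
    grad_W2 = h.T @ (pred - y) / len(X)   # gradient w.r.t. W2 only
    W2 -= lr * grad_W2

final_loss = float(np.mean((forward(X, W1, W2)[0] - y) ** 2))
```

After re-training, `final_loss` is lower than `initial_loss` even though only one layer was updated — the essence of adapting a pre-trained model to a new system with limited data.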