March 27, 2024, 4:43 a.m. | Zhiwei Fang, Sifan Wang, Paris Perdikaris

cs.LG updates on arXiv.org

arXiv:2302.13143v2 Announce Type: replace
Abstract: While the popularity of physics-informed neural networks (PINNs) is steadily rising, to date PINNs have not been successful in simulating multi-scale and singular perturbation problems. In this work, we present a new training paradigm, referred to as "gradient boosting" (GB), that significantly enhances the performance of PINNs. Rather than learning the solution of a given PDE with a single neural network directly, our algorithm employs a sequence of neural networks …
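The abstract is truncated, but the stated idea is the classic gradient-boosting loop: instead of one model, train a sequence in which each member fits the residual left by the ensemble so far. A minimal sketch of that residual-fitting loop, using polynomial least-squares learners as a hypothetical stand-in for the paper's neural networks (the learner choice and stage schedule here are illustrative assumptions, not the paper's method):

```python
import numpy as np

def fit_stage(x, residual, degree):
    # One "weak learner": a least-squares polynomial fit to the
    # current residual. The GB-PINN paper uses neural networks here;
    # polynomials keep this sketch dependency-free.
    coeffs = np.polyfit(x, residual, degree)
    return lambda xq: np.polyval(coeffs, xq)

def boost(x, y, degrees):
    # Build an additive ensemble u(x) = sum_k f_k(x), where each
    # stage is trained on what the ensemble so far leaves unexplained.
    stages = []
    pred = np.zeros_like(y)
    for d in degrees:
        f = fit_stage(x, y - pred, d)  # fit the remaining residual
        stages.append(f)
        pred = pred + f(x)             # additive update of the ensemble
    return lambda xq: sum(f(xq) for f in stages)

# Stand-in target playing the role of the PDE solution to recover.
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x)
u = boost(x, y, degrees=[1, 3, 5, 7, 9, 11])
rms = np.sqrt(np.mean((u(x) - y) ** 2))
```

The stagewise capacity increase (low-degree learners first, richer ones later) mirrors the intuition of letting later ensemble members resolve finer scales that early members miss, which is the kind of multi-scale behavior the abstract says single-network PINNs struggle with.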
