Scaling physics-informed hard constraints with mixture-of-experts
Feb. 22, 2024, 5:41 a.m. | Nithin Chalapathi, Yiheng Du, Aditi Krishnapriyan
cs.LG updates on arXiv.org
Abstract: Imposing known physical constraints, such as conservation laws, during neural network training introduces an inductive bias that can improve accuracy, reliability, convergence, and data efficiency for modeling physical dynamics. While such constraints can be softly imposed via loss function penalties, recent advancements in differentiable physics and optimization improve performance by incorporating PDE-constrained optimization as individual layers in neural networks. This enables a stricter adherence to physical constraints. However, imposing hard constraints significantly increases computational and …
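The abstract contrasts soft penalty terms with hard-constraint layers that enforce physical laws exactly at the network output. As a minimal illustrative sketch (not the authors' implementation), a hard linear constraint such as a discrete conservation law `A z = b` can be enforced by a closed-form, differentiable Euclidean projection appended after the network; the function and variable names below are hypothetical:

```python
import numpy as np

def project_onto_constraint(x, A, b):
    """Project x onto the affine set {z : A z = b}.

    Closed-form solution of min ||z - x||^2 subject to A z = b:
        z = x - A^T (A A^T)^{-1} (A x - b)
    The map is differentiable in x, so it can act as a
    hard-constraint output layer on top of a neural network.
    """
    residual = A @ x - b                              # constraint violation of the raw output
    correction = A.T @ np.linalg.solve(A @ A.T, residual)
    return x - correction

# Toy conservation law: the components of x must sum to 1.
A = np.ones((1, 3))
b = np.array([1.0])
raw = np.array([0.5, 0.4, 0.4])          # unconstrained network output (sums to 1.3)
hard = project_onto_constraint(raw, A, b)  # -> [0.4, 0.3, 0.3], sums to exactly 1
```

Penalty-based training only pushes `A x - b` toward zero; the projection guarantees it is zero at every step, which is the stricter adherence the abstract refers to. The paper's actual layers solve PDE-constrained optimization problems, which generally require an iterative differentiable solver rather than this closed-form linear case.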