Neural Conservation Laws: A Divergence-Free Perspective. (arXiv:2210.01741v1 [cs.LG])
Oct. 5, 2022, 1:12 a.m. | Jack Richter-Powell, Yaron Lipman, Ricky T. Q. Chen
cs.LG updates on arXiv.org
We investigate the parameterization of deep neural networks that by design
satisfy the continuity equation, a fundamental conservation law. This is
enabled by the observation that solutions of the continuity equation can be
represented as divergence-free vector fields. We hence propose building
divergence-free neural networks through the concept of differential forms, and
with the aid of automatic differentiation, realize two practical constructions.
As a result, we can parameterize pairs of densities and vector fields that
always satisfy the continuity …
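The core idea — building a vector field that is divergence-free by construction, rather than penalizing divergence in a loss — has a simple 2D instance: the rotated gradient of any scalar potential has zero divergence, because mixed partial derivatives commute. The sketch below illustrates this with a small hand-written "network" as the potential and finite differences standing in for automatic differentiation; it is a minimal illustration of the principle, not the paper's differential-forms construction, and all weights and function names are invented for the example.

```python
import math

# Scalar potential psi(x, y): any smooth function works. Here a tiny
# one-hidden-layer "network" with arbitrary, hypothetical weights.
def psi(x, y):
    h1 = math.tanh(0.7 * x - 0.3 * y + 0.1)
    h2 = math.tanh(-0.2 * x + 0.9 * y - 0.4)
    return 1.3 * h1 - 0.8 * h2

EPS = 1e-5  # central-difference step (stands in for autodiff)

def d_dx(f, x, y):
    return (f(x + EPS, y) - f(x - EPS, y)) / (2 * EPS)

def d_dy(f, x, y):
    return (f(x, y + EPS) - f(x, y - EPS)) / (2 * EPS)

# In 2D, the rotated gradient of psi is divergence-free by construction:
#   v = (d psi / dy, -d psi / dx)
#   div v = d^2 psi / dx dy - d^2 psi / dy dx = 0  (mixed partials commute)
def v(x, y):
    return d_dy(psi, x, y), -d_dx(psi, x, y)

def divergence(x, y):
    vx = lambda a, b: v(a, b)[0]
    vy = lambda a, b: v(a, b)[1]
    return d_dx(vx, x, y) + d_dy(vy, x, y)

print(abs(divergence(0.5, -0.2)))  # near zero, up to floating-point error
```

In higher dimensions this simple curl trick no longer suffices, which is where the paper's differential-forms machinery comes in: an antisymmetric construction generalizes "mixed partials cancel" to arbitrary dimension, with exact automatic differentiation replacing the finite differences used here.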