Neural Modes: Self-supervised Learning of Nonlinear Modal Subspaces
April 30, 2024, 4:41 a.m. | Jiahong Wang, Yinwei Du, Stelian Coros, Bernhard Thomaszewski
cs.LG updates on arXiv.org arxiv.org
Abstract: We propose a self-supervised approach for learning physics-based subspaces for real-time simulation. Existing learning-based methods construct subspaces by approximating pre-defined simulation data in a purely geometric way. However, this approach tends to produce high-energy configurations, leads to entangled latent space dimensions, and generalizes poorly beyond the training set. To overcome these limitations, we propose a self-supervised approach that directly minimizes the system's mechanical energy during training. We show that our method leads to learned subspaces …
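The core idea of the abstract — learning a simulation subspace by minimizing the system's mechanical energy directly, rather than by fitting precomputed simulation data — can be illustrated with a toy sketch. The code below is not the paper's method (which learns nonlinear neural subspaces); it is a minimal linear analogue under assumed simplifications: a 1-D spring chain with quadratic energy, a linear decoder `x = U z`, and a hypothetical orthonormality penalty to keep the learned subspace from collapsing to the zero-energy rest state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mechanical system: a 1-D chain of n nodes joined by unit springs.
# Stiffness matrix K gives mechanical energy E(x) = 0.5 * x^T K x,
# where x is the vector of node displacements.
n = 10
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

d = 2                                      # latent (modal) dimension
U = rng.normal(scale=0.1, size=(n, d))     # linear "decoder": x = U @ z

def loss_and_grad(U, Z, lam=1.0):
    """Mean mechanical energy of decoded samples, plus an orthonormality
    penalty on U that stops the subspace collapsing to zero (an assumed
    regularizer for this sketch, not taken from the paper)."""
    X = Z @ U.T                            # (batch, n) decoded configurations
    energy = 0.5 * np.mean(np.einsum('bi,ij,bj->b', X, K, X))
    G = np.eye(d) - U.T @ U
    loss = energy + lam * np.sum(G**2)
    # Analytic gradients of both terms with respect to U.
    grad_e = (K @ X.T @ Z) / len(Z)        # d(mean energy)/dU
    grad_o = -4.0 * lam * U @ G            # d(penalty)/dU
    return loss, grad_e + grad_o

# Self-supervision: no simulation data, only latent samples — the training
# signal is the physical energy of whatever the decoder emits.
Z = rng.normal(size=(256, d))
losses = []
for _ in range(500):
    loss, grad = loss_and_grad(U, Z)
    U -= 0.01 * grad
    losses.append(loss)
```

After training, the columns of `U` align with the low-energy (low-frequency) modes of the chain, which is the linear intuition behind learning "modal" subspaces; the paper's contribution is doing this with a nonlinear neural decoder so the subspace generalizes beyond small displacements.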