Manifold Learning by Mixture Models of VAEs for Inverse Problems
June 13, 2024, 4:49 a.m. | Giovanni S. Alberti, Johannes Hertrich, Matteo Santacesaria, Silvia Sciutto
stat.ML updates on arXiv.org arxiv.org
Abstract: Representing a manifold of very high-dimensional data with generative models has been shown to be computationally efficient in practice. However, this requires that the data manifold admit a global parameterization. To represent manifolds of arbitrary topology, we propose learning a mixture model of variational autoencoders, in which every encoder-decoder pair represents one chart of the manifold. We propose a loss function for maximum likelihood estimation of the model weights and choose an architecture …
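The core idea in the abstract (a mixture of VAEs, each chart with its own encoder-decoder pair, trained via a likelihood-based loss) can be sketched as follows. This is an illustrative toy, not the paper's implementation: the linear encoders/decoders, the fixed unit log-variance, and the uniform mixture weights are all simplifying assumptions, and the mixture loss shown is a generic log-sum-exp of per-chart ELBOs rather than the authors' exact objective.

```python
import numpy as np

rng = np.random.default_rng(0)

class ChartVAE:
    """One chart of the manifold: a single encoder-decoder pair.
    Linear maps with untrained random weights, for illustration only."""
    def __init__(self, data_dim, latent_dim):
        self.W_enc = rng.normal(scale=0.1, size=(latent_dim, data_dim))
        self.W_dec = rng.normal(scale=0.1, size=(data_dim, latent_dim))

    def elbo(self, x):
        mu = self.W_enc @ x                    # encoder mean (log-variance fixed to 0)
        z = mu + rng.normal(size=mu.shape)     # reparameterization sample, z ~ N(mu, I)
        x_hat = self.W_dec @ z                 # decoder reconstruction
        recon = -0.5 * np.sum((x - x_hat) ** 2)  # Gaussian log-likelihood (up to a constant)
        kl = 0.5 * np.sum(mu ** 2)               # KL( N(mu, I) || N(0, I) )
        return recon - kl

def mixture_loss(charts, log_weights, x):
    """Negative log of the weighted mixture likelihood, with each chart's
    likelihood lower-bounded by its ELBO; log-sum-exp for stability."""
    terms = np.array([lw + c.elbo(x) for c, lw in zip(charts, log_weights)])
    m = terms.max()
    return -(m + np.log(np.exp(terms - m).sum()))

data_dim, latent_dim, n_charts = 8, 2, 3
charts = [ChartVAE(data_dim, latent_dim) for _ in range(n_charts)]
log_w = np.log(np.full(n_charts, 1.0 / n_charts))  # uniform mixture weights (assumption)
x = rng.normal(size=data_dim)
loss = mixture_loss(charts, log_w, x)
print(loss)
```

In the paper's setting the mixture weights and the per-chart encoder/decoder parameters would all be learned jointly by minimizing such a loss over the dataset, so that different charts specialize to different regions of a manifold that admits no single global parameterization.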