Preventing Model Collapse in Gaussian Process Latent Variable Models
April 3, 2024, 4:42 a.m. | Ying Li, Zhidi Lin, Feng Yin, Michael Minyi Zhang
cs.LG updates on arXiv.org arxiv.org
Abstract: Gaussian process latent variable models (GPLVMs) are a versatile family of unsupervised learning models, commonly used for dimensionality reduction. However, common challenges in modeling data with GPLVMs include inadequate kernel flexibility and improper selection of the projection noise, which leads to a type of model collapse characterized primarily by vague latent representations that do not reflect the underlying structure of the data. This paper addresses these issues by, first, theoretically examining the impact of the …
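For readers unfamiliar with the model being discussed, the GPLVM objective the abstract refers to can be sketched in a few lines. A GPLVM treats each observed dimension as an independent GP over learned latent coordinates, and fits the latents (and kernel/noise hyperparameters) by maximizing the marginal log-likelihood. The snippet below is a minimal NumPy sketch of that likelihood, not the authors' method; the RBF kernel choice and the fixed `noise` value are illustrative assumptions. Note how a poorly chosen projection noise enters directly on the diagonal of the kernel matrix, which is where the collapse behavior described in the abstract originates.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel over latent points X (N x Q).
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(X**2, axis=1)[None, :]
                - 2.0 * X @ X.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gplvm_log_marginal(Y, X, noise=0.1):
    """GPLVM marginal log-likelihood of data Y (N x D) given latents X (N x Q).

    Each of the D output dimensions is an independent GP over X, with
    i.i.d. Gaussian projection noise of variance `noise` on the diagonal.
    """
    N, D = Y.shape
    K = rbf_kernel(X) + noise * np.eye(N)     # projection noise enters here
    L = np.linalg.cholesky(K)                 # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))  # K^{-1} Y
    # Sum over the D independent output dimensions:
    return (-0.5 * np.sum(Y * alpha)          # data-fit term
            - D * np.sum(np.log(np.diag(L)))  # -0.5 * D * log|K|
            - 0.5 * N * D * np.log(2.0 * np.pi))
```

In a full GPLVM this quantity would be maximized jointly over `X` and the hyperparameters; the paper's concern is that with an inflexible kernel or a badly selected noise level, that optimization can settle on vague latents that carry little of the data's structure.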