April 3, 2024, 4:42 a.m. | Ying Li, Zhidi Lin, Feng Yin, Michael Minyi Zhang

cs.LG updates on arXiv.org

arXiv:2404.01697v1 Announce Type: cross
Abstract: Gaussian process latent variable models (GPLVMs) are a versatile family of unsupervised learning models commonly used for dimensionality reduction. However, common challenges in modeling data with GPLVMs include inadequate kernel flexibility and improper selection of the projection noise, which lead to a type of model collapse characterized primarily by vague latent representations that do not reflect the underlying structure of the data. This paper addresses these issues by, first, theoretically examining the impact of the …
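To make the abstract's terms concrete, the following is a minimal numerical sketch of the GPLVM log marginal likelihood, showing where the two quantities the abstract highlights enter: kernel flexibility (here an assumed RBF kernel with a lengthscale and variance) and the projection noise (an i.i.d. Gaussian noise variance added to the kernel). This is an illustrative sketch, not the paper's method; the function names and hyperparameter choices are the author's own assumptions.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel over latent points X of shape (N, Q).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gplvm_log_marginal(Y, X, lengthscale=1.0, variance=1.0, noise=0.1):
    # Log marginal likelihood of data Y (N, D) under a GPLVM:
    # each of the D output dimensions is an independent GP over the
    # latents X, with Gaussian projection-noise variance `noise`.
    N, D = Y.shape
    K = rbf_kernel(X, lengthscale, variance) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))  # K^{-1} Y
    logdet = 2.0 * np.log(np.diag(L)).sum()
    return (-0.5 * (Y * alpha).sum()
            - 0.5 * D * logdet
            - 0.5 * N * D * np.log(2.0 * np.pi))
```

In a full GPLVM the latents X (and the kernel and noise hyperparameters) are optimized or inferred to maximize this quantity; a badly chosen noise level flattens the likelihood surface over X, which is one route to the vague, collapsed latent representations the abstract describes.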

