Representation learning for maximization of MI, nonlinear ICA and nonlinear subspaces with robust density ratio estimation. (arXiv:2101.02083v2 [cs.LG] UPDATED)
Aug. 10, 2022, 1:11 a.m. | Hiroaki Sasaki, Takashi Takenouchi
stat.ML updates on arXiv.org arxiv.org
Contrastive learning is a promising recent approach to unsupervised
representation learning, in which a feature representation of data is learned
by solving a pseudo classification problem constructed from unlabelled data.
However, it is not straightforward to understand what representation
contrastive learning yields. In addition, contrastive learning is often based
on maximum likelihood estimation, which tends to be vulnerable to
contamination by outliers. To promote understanding of contrastive learning,
this paper first theoretically shows a connection to maximization of mutual …
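The abstract's framing of contrastive learning as a pseudo classification problem can be illustrated with a minimal InfoNCE-style sketch. This is not the paper's method, only a common instance of the idea: a critic scores candidate pairings, each sample must "classify" its true partner among the batch, and the resulting loss yields a lower bound on mutual information. All names and the quadratic critic below are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): contrastive learning as
# pseudo classification over a batch, InfoNCE-style.
import numpy as np

rng = np.random.default_rng(0)

def infonce_loss(score):
    """score[i, j]: critic value for pairing x_i with y_j.
    Pseudo classification task: for each x_i, pick its true partner y_i
    among the batch, i.e. a row-wise softmax with the diagonal as label."""
    logits = score - score.max(axis=1, keepdims=True)  # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))

n = 512
x = rng.normal(size=n)
y = x + 0.1 * rng.normal(size=n)          # positives: (x_i, y_i) correlated
score = -(x[:, None] - y[None, :]) ** 2   # a simple fixed critic (assumed)

loss = infonce_loss(score)
# For any critic, log(n) - loss lower-bounds the mutual information I(X; Y).
mi_lower_bound = np.log(n) - loss
```

A random-guessing critic would give a loss of about log(n), so a loss well below that indicates the critic has learned a useful density-ratio-like score, which is the connection to mutual information the abstract alludes to.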