Understanding the Role of Nonlinearity in Training Dynamics of Contrastive Learning. (arXiv:2206.01342v1 [cs.LG])
June 6, 2022, 1:10 a.m. | Yuandong Tian
cs.LG updates on arXiv.org arxiv.org
While the empirical success of self-supervised learning (SSL) relies heavily
on the use of deep nonlinear models, many theoretical works proposed to
understand SSL still focus on linear ones. In this paper, we study the role of
nonlinearity in the training dynamics of contrastive learning (CL) on one- and
two-layer nonlinear networks with homogeneous activation $h(x) = h'(x)x$. We
theoretically demonstrate that (1) the presence of nonlinearity leads to many
local optima even in the 1-layer setting, each corresponding to certain …
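The homogeneity identity $h(x) = h'(x)x$ mentioned in the abstract is satisfied by common piecewise-linear activations such as ReLU and leaky ReLU (at $x = 0$ the subgradient convention $h'(0) = 0$ makes the identity hold trivially). A minimal numerical check of this identity, not taken from the paper itself:

```python
import numpy as np

def relu(x):
    """ReLU activation: h(x) = max(x, 0)."""
    return np.maximum(x, 0.0)

def relu_grad(x):
    """Derivative of ReLU, with the convention h'(0) = 0."""
    return (x > 0).astype(float)

def leaky_relu(x, alpha=0.1):
    """Leaky ReLU: h(x) = x for x > 0, alpha * x otherwise."""
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.1):
    """Derivative of leaky ReLU, with h'(0) = alpha."""
    return np.where(x > 0, 1.0, alpha)

# Verify h(x) = h'(x) * x pointwise on a grid of inputs.
xs = np.linspace(-3.0, 3.0, 13)
assert np.allclose(relu(xs), relu_grad(xs) * xs)
assert np.allclose(leaky_relu(xs), leaky_relu_grad(xs) * xs)
```

This property lets the network output be written as an input-dependent linear map (the activation pattern times the weights), which is what makes the training dynamics of such nonlinear networks analytically tractable.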