Web: http://arxiv.org/abs/2202.12387

Sept. 22, 2022, 1:12 a.m. | Zhuoning Yuan, Yuexin Wu, Zi-Hao Qiu, Xianzhi Du, Lijun Zhang, Denny Zhou, Tianbao Yang

cs.LG updates on arXiv.org

In this paper, we study contrastive learning from an optimization
perspective, aiming to analyze and address a fundamental issue of existing
contrastive learning methods that rely on either a large batch size or a large
dictionary of feature vectors. We consider a global objective for contrastive
learning, which contrasts each positive pair with all negative pairs for an
anchor point. From the optimization perspective, we explain why existing
methods such as SimCLR require a large batch size in order to …
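For reference, the batch-size dependence the abstract points to can be seen in the standard in-batch contrastive loss used by SimCLR-style methods, where each anchor's positive is contrasted only against the other 2B − 2 samples in the same batch, so the denominator is a batch-local stand-in for the sum over all negatives in the global objective. The sketch below is illustrative only (not the authors' code); the function name, temperature value, and tensor shapes are assumptions:

```python
# Minimal sketch of an in-batch (SimCLR-style) contrastive loss.
# Hypothetical helper, not from the paper's implementation.
import torch
import torch.nn.functional as F

def batch_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """z1, z2: (B, d) embeddings of two augmented views of the same B anchors."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, d), unit-norm rows
    sim = z @ z.t() / tau                                # (2B, 2B) scaled cosine similarities
    n = z.shape[0]
    sim.fill_diagonal_(float('-inf'))                    # exclude self-similarity from the softmax
    # Positive of anchor i is its other view: index (i + B) mod 2B.
    pos = torch.arange(n, device=z.device).roll(n // 2)
    # Cross-entropy per row: numerator = positive pair,
    # denominator = only the 2B - 2 in-batch negatives.
    return F.cross_entropy(sim, pos)
```

The global objective described in the abstract instead contrasts each positive pair with all negative pairs for an anchor, i.e., the denominator ranges over the whole dataset rather than the batch, which is what a stochastic method must approximate well for small batches to work.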

Tags: arxiv, global optimization, performance, small, stochastic
