Web: http://arxiv.org/abs/2201.08702

Jan. 24, 2022, 2:10 a.m. | Qianben Chen, Richong Zhang, Yaowei Zheng, Yongyi Mao

cs.LG updates on arXiv.org arxiv.org

Contrastive learning has achieved remarkable success in representation
learning via self-supervision in unsupervised settings. However, effectively
adapting contrastive learning to supervised learning tasks remains a
challenge in practice. In this work, we introduce a dual contrastive learning
(DualCL) framework that simultaneously learns the features of input samples and
the parameters of classifiers in the same space. Specifically, DualCL regards
the parameters of the classifiers as augmented samples associated with different
labels and then exploits the contrastive learning between the …
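As a rough illustration of the idea, the sketch below implements one direction of a simplified dual contrastive objective in NumPy: sample features and per-class classifier vectors live in the same space, and for each sample the classifier vector of its own label serves as the positive while the other class vectors serve as negatives. This is my own simplified reading, not the paper's exact formulation (DualCL also contrasts in the other direction and generates label-aware classifier parameters per sample); all names here are hypothetical.

```python
import numpy as np

def dual_contrastive_loss(z, theta, labels, tau=0.1):
    """Simplified sketch: features z (n, d) and classifier vectors
    theta (k, d) share one space; each sample's own class vector is
    its positive, the remaining class vectors are negatives."""
    # normalize so dot products are cosine similarities
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    theta = theta / np.linalg.norm(theta, axis=1, keepdims=True)
    logits = z @ theta.T / tau                    # (n, k) similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    # InfoNCE-style loss: -log p(own class vector | sample)
    return -np.log(probs[np.arange(len(labels)), labels]).mean()

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))       # toy sample features
theta = rng.normal(size=(3, 16))   # toy classifier vectors, 3 classes
labels = rng.integers(0, 3, size=8)
loss = dual_contrastive_loss(z, theta, labels)
```

Minimizing such a loss pulls each sample's feature toward its label's classifier vector and pushes it away from the others, which is the sense in which the classifier parameters and the representations are learned jointly in one space.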

