RényiCL: Contrastive Representation Learning with Skew Rényi Divergence. (arXiv:2208.06270v1 [stat.ML])
stat.ML updates on arXiv.org
Contrastive representation learning seeks to acquire useful representations
by estimating the shared information between multiple views of data. Here, the
quality of the learned representations is sensitive to the choice of data
augmentation: the harder the augmentations applied, the more task-relevant
information the views share, but also the more task-irrelevant information,
which can hinder the generalization capability of the representations.
Motivated by this, we present a new robust contrastive learning scheme, coined
RényiCL, which can effectively manage harder augmentations by utilizing Rényi
divergence. …
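To make the setup concrete, here is a minimal sketch of a standard contrastive (InfoNCE-style) objective, the kind of mutual-information estimator that contrastive representation learning builds on. This is illustrative only and is not RényiCL's skew Rényi divergence estimator; the function name and toy data are assumptions.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE contrastive loss between two views of the same batch.

    z1, z2: (n, d) embedding arrays; row i of z1 and row i of z2 form a
    positive pair, while all other rows serve as negatives.
    """
    # L2-normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; maximize their log-probability.
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# A mildly augmented "second view" shares information with the first ...
loss_aligned = info_nce_loss(z, z + 0.1 * rng.normal(size=(8, 16)))
# ... while an unrelated view shares none, yielding a higher loss.
loss_random = info_nce_loss(z, rng.normal(size=(8, 16)))
print(loss_aligned < loss_random)  # prints True
```

Harder augmentations would push the second view further from the first, which is exactly the regime the abstract says the Rényi-divergence-based objective is designed to handle robustly.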