July 20, 2022, 1:11 a.m. | Ziqiang Li, Chaoyue Wang, Heliang Zheng, Jing Zhang, Bin Li

cs.LG updates on arXiv.org

Data-Efficient GANs (DE-GANs), which aim to learn generative models from a
limited amount of training data, face several challenges in generating
high-quality samples. Since data augmentation strategies have largely
alleviated training instability, how to further improve the generative
performance of DE-GANs has become a research hotspot. Recently, contrastive
learning has shown great potential for increasing the synthesis quality of
DE-GANs, yet the underlying principles remain under-explored. In this paper,
we revisit and compare different contrastive learning strategies in DE-GANs, …
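The abstract is truncated before the paper's specific strategies, but as a rough illustration of the general idea it compares, here is a minimal sketch of an InfoNCE-style contrastive loss, a common auxiliary objective in this line of DE-GAN work (the function name, feature sources, and temperature are illustrative assumptions, not the authors' exact formulation):

```python
import torch
import torch.nn.functional as F

def info_nce_loss(feats_a: torch.Tensor, feats_b: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE loss between two batches of paired features.

    feats_a, feats_b: (N, D) embeddings of two augmented views of the
    same N images (e.g., discriminator features of an image and its
    augmentation). Matching rows are positives; all others are negatives.
    """
    a = F.normalize(feats_a, dim=1)
    b = F.normalize(feats_b, dim=1)
    logits = a @ b.t() / temperature          # (N, N) similarity matrix
    targets = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, targets)   # positives lie on the diagonal
```

In a DE-GAN setting, such a term is typically added to the discriminator or generator objective with a small weight; where the features come from and how the term is weighted are among the design choices papers like this one compare.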

arxiv cv data discontinuity gans learning
