[D] Intuition behind GAN augmentation
I've seen several papers that train a conditional GAN on a labelled training set and then use that cGAN to generate additional training data for a supervised model, which ends up outperforming the baseline of training the supervised model on the original data alone. Sometimes there's an extra filtering step to throw away garbage outputs from the cGAN, but this usually doesn't require any extra supervision.
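To make the pipeline concrete, here is a toy sketch of the train-generate-filter-augment loop. It is not any specific paper's method: a class-conditional Gaussian fitted to the real data stands in for the cGAN, and the "garbage filter" is a simple 3-sigma likelihood check under the same per-class model (both are hypothetical stand-ins for illustration).

```python
import random
import statistics

random.seed(0)

# Tiny labelled "real" dataset: (feature, label) pairs.
real = [(0.10, 0), (0.20, 0), (0.15, 0), (0.90, 1), (1.00, 1), (0.95, 1)]

# Step 1: "train" the conditional generator. Here that just means
# fitting a per-class mean/stdev (a stand-in for cGAN training).
def fit_generator(data):
    params = {}
    for label in {y for _, y in data}:
        xs = [x for x, y in data if y == label]
        params[label] = (statistics.mean(xs), statistics.stdev(xs))
    return params

# Step 2: conditional sampling, i.e. "generate more data for class y".
def sample(label, params):
    mu, sigma = params[label]
    return random.gauss(mu, sigma)

# Step 3: unsupervised garbage filter: reject samples that are
# implausible under the class-conditional model (no extra labels needed).
def keep(x, label, params, k=3.0):
    mu, sigma = params[label]
    return abs(x - mu) <= k * sigma

params = fit_generator(real)
synthetic = [(sample(y, params), y) for y in (0, 1) for _ in range(20)]
filtered = [(x, y) for x, y in synthetic if keep(x, y, params)]

# Step 4: the supervised model would now train on real + filtered.
augmented = real + filtered
```

The open question the post raises is exactly why `augmented` should help: the generator was fit on `real`, so in principle it carries no information the supervised model couldn't extract directly.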
I used to see this as a "free lunch", but now I …