DCGAN becomes unstable when I don't directly use Binary Cross Entropy for loss calculation.
March 23, 2024, 8:40 a.m. | /u/mono1110
Deep Learning www.reddit.com
When I use BCE for the loss calculation for both the generator and the discriminator, it works fine.
But when I switch to -log(D(x)) - log(1 - D(G(x))) for the discriminator and log(1 - D(G(x))) for the generator, it works fine up to epoch 12. It can generate images, but after that everything goes to NaN.
What could be the reason for this behavior?
It shouldn't matter which loss function I use. The values must be the same, right?
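The two forms are mathematically identical, but not numerically: once the discriminator saturates and outputs D(G(x)) very close to 1 (or D(x) close to 0), the hand-written log terms hit log(0) and the loss becomes infinite, after which gradients go NaN. Library BCE implementations guard against this (e.g. by clamping the log terms or computing the loss from logits), which is likely why the BCE version survives past epoch 12. A minimal sketch of the difference, using plain Python and a hypothetical `eps` clamp to stand in for the safeguard a library loss applies:

```python
import math

def manual_d_loss(d_real, d_fake):
    # Naive discriminator loss: -log(D(x)) - log(1 - D(G(x))).
    # Diverges when d_fake -> 1 or d_real -> 0, since log(0) is undefined.
    return -math.log(d_real) - math.log(1 - d_fake)

def clamped_d_loss(d_real, d_fake, eps=1e-7):
    # Same loss, but with probabilities clamped away from 0 and 1,
    # mimicking the numerical safeguards in library BCE implementations.
    d_real = min(max(d_real, eps), 1 - eps)
    d_fake = min(max(d_fake, eps), 1 - eps)
    return -math.log(d_real) - math.log(1 - d_fake)

# Once the discriminator fully "wins" on a fake sample (D(G(x)) == 1.0),
# the naive form blows up while the clamped form stays finite:
try:
    manual_d_loss(0.9, 1.0)
except ValueError:
    print("naive loss: log(0) -> math domain error")

print(clamped_d_loss(0.9, 1.0))  # large but finite
```

The same reasoning applies to the generator term log(1 - D(G(x))): besides the log(0) issue, that form also has vanishing gradients when the discriminator is confident, which is why the non-saturating -log(D(G(x))) variant is commonly used instead.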