DCGAN becomes unstable when I don't directly use Binary Cross Entropy for loss calculation.
March 23, 2024, 8:40 a.m. | /u/mono1110
Deep Learning www.reddit.com
When I use BCE for the loss calculation of both the generator and the discriminator, it works fine.
But when I switch to -log(D(x)) - log(1 - D(G(x))) for the discriminator and log(1 - D(G(x))) for the generator, it works fine up to epoch 12 and can generate images, but after that everything goes NaN.
What could be the reason for such behavior?
It shouldn't matter which loss function I use. The values should be the same, right?
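A hedged sketch of one likely explanation (not from the original post; the tensors below are made-up placeholder values): the two formulations are the same expectation mathematically, but they differ numerically once the discriminator saturates. torch.log of an exact 0 returns -inf, and the resulting gradients turn the weights, and every later output, into NaN, whereas PyTorch's nn.BCELoss clamps each log term at -100 and stays finite.

```python
import torch

# Hypothetical discriminator outputs after the sigmoid; the exact 0.0 / 1.0
# entries mimic a saturated discriminator late in training.
d_real = torch.tensor([0.999, 0.0])   # D(x) on a batch of real images
d_fake = torch.tensor([0.001, 1.0])   # D(G(x)) on a batch of generated images

# Hand-written losses as in the post: log(0) gives -inf / inf, and the
# gradients that flow back from it poison the weights with NaN.
d_loss_manual = (-torch.log(d_real) - torch.log(1 - d_fake)).mean()
g_loss_manual = torch.log(1 - d_fake).mean()
print(d_loss_manual, g_loss_manual)   # tensor(inf) tensor(-inf)

# The same targets via BCELoss: PyTorch clamps each log term at -100,
# so a saturated discriminator gives a large but finite loss instead of inf.
bce = torch.nn.BCELoss()
d_loss_bce = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
print(d_loss_bce)                     # finite

# A common fix for the manual version: clamp probabilities away from 0 and 1.
eps = 1e-7
d_loss_safe = (-torch.log(d_real.clamp(eps, 1 - eps))
               - torch.log((1 - d_fake).clamp(eps, 1 - eps))).mean()
print(d_loss_safe)                    # finite
```

If that is the cause, clamping the probabilities (or computing the loss from logits with nn.BCEWithLogitsLoss) should reproduce the stable behaviour of the BCE run.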