A Common Misconception About Cross Entropy Loss
Feb. 11, 2024, 5:38 p.m. | /u/Toasty_toaster
Data Science www.reddit.com
The misconception is that the network *only* learns from its prediction on the correct class
It is common online to see comments [like this one](https://datascience.stackexchange.com/a/20301/159699) that, while technically true, obscure how a neural network actually updates its parameters after training on a single sample in multi-class classification. Other comments, [such as this one](https://datascience.stackexchange.com/a/31966/159699) [and this one](https://datascience.stackexchange.com/questions/20296/cross-entropy-loss-explanation/24696#comment91209_24696), are flat-out wrong. This makes studying this topic …
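The point at issue can be checked directly. When cross-entropy loss is combined with a softmax output layer, the gradient with respect to the logits is `p - y` (predicted probabilities minus the one-hot target), so *every* logit receives a nonzero gradient, not only the one for the correct class. A minimal NumPy sketch (hypothetical values, assuming a single sample with three classes):

```python
import numpy as np

# Logits for one sample with 3 classes; class 1 is the correct class.
z = np.array([2.0, 1.0, 0.1])
y = np.array([0.0, 1.0, 0.0])  # one-hot target

# Softmax probabilities (shift by the max logit for numerical stability).
p = np.exp(z - z.max())
p /= p.sum()

# Gradient of the cross-entropy loss w.r.t. the logits: dL/dz = p - y.
grad = p - y

# Every component is nonzero: incorrect classes are pushed down in
# proportion to their predicted probability, not ignored.
print(grad)
```

Because the softmax couples all the logits, the incorrect classes are actively suppressed on every update, which is exactly what the "only the correct class matters" framing misses.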