May 18, 2023, 5:02 p.m. | /u/pocketjet

Machine Learning www.reddit.com

I just rediscovered an article on visual information theory by Colah: [https://colah.github.io/posts/2015-09-Visual-Information/](https://colah.github.io/posts/2015-09-Visual-Information/)

I've used cross-entropy in different ML projects but never fully understood it. This article explains entropy as a "continuous analog" of Shannon codes, which I thought offered a unique perspective on this basic concept.
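As a quick illustration of that framing (my own sketch, not code from the article), here is a minimal NumPy example treating entropy as the average optimal code length in bits, and cross-entropy as the average length when you use a code built for the wrong distribution:

```python
import numpy as np

def entropy(p):
    """H(p) = -sum p(x) * log2 p(x): average code length (bits) when events
    are drawn from p and the code lengths are matched to p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # p(x) = 0 terms contribute nothing
    return -np.sum(p * np.log2(p))

def cross_entropy(p, q):
    """H(p, q) = -sum p(x) * log2 q(x): average code length when events come
    from p but the code is optimized for q. Always >= H(p)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log2(q[mask]))

# Toy distributions (my own numbers): a skewed "true" p and a uniform q.
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25, 0.25, 0.25, 0.25]
print(entropy(p))           # 1.75 bits
print(cross_entropy(p, q))  # 2.00 bits; the 0.25-bit gap is the KL divergence
```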

What are some articles you find interesting?

