Aug. 18, 2023, 9:54 a.m. | /u/LeanderKu

Neural Networks, Deep Learning and Machine Learning www.reddit.com

I just stumbled upon a neat blog post on how ReLUs approximate non-linear functions.

When talking to others, I often feel that the piecewise linearity (even of composed layers) is not on people's minds, but I often think about it when imagining the behavior of networks.

