April 4, 2023, 10:36 p.m. | Jeremy Howard


(All lesson resources are available at http://course.fast.ai.) In this lesson, we dive into backpropagation and the creation of a simple Multi-Layer Perceptron (MLP) neural network. We start by reviewing basic neural networks and their architecture, then move on to implementing a simple MLP from scratch. We focus on understanding the chain rule and backpropagation in the context of neural networks, and demonstrate how to calculate derivatives using Python and the SymPy library.
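As a small illustration of the kind of derivative calculation described above, here is a sketch using SymPy to differentiate a composite function, letting the library apply the chain rule symbolically (the specific function is an arbitrary example, not one from the lesson):

```python
import sympy as sp

# Symbolic input variable
x = sp.symbols('x')

# A toy composite function f(g(x)), with g(x) = 3x^2 + 2 and f(u) = u^2,
# standing in for one layer's output feeding into a squared loss.
g = 3 * x**2 + 2
loss = g**2

# SymPy applies the chain rule: d(loss)/dx = 2*g(x) * g'(x)
#                                          = 2*(3*x**2 + 2) * 6*x
grad = sp.diff(loss, x)
print(grad)

# Evaluate the gradient at a concrete point, as backprop would
print(grad.subs(x, 1.0))  # 12*1*(3+2) = 60.0
```

Working through the same derivative by hand and comparing it with SymPy's answer is a useful way to check your understanding of the chain rule before moving to multi-layer networks.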

We also discuss the importance of the chain …
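The chain rule is exactly what a from-scratch backward pass applies layer by layer. Below is a minimal sketch of a two-layer MLP (linear, ReLU, linear, MSE loss) with hand-written gradients; it uses NumPy with arbitrary shapes for illustration, rather than reproducing the lesson's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and parameters (shapes chosen arbitrarily for illustration)
x = rng.standard_normal((5, 3))       # 5 samples, 3 features
y = rng.standard_normal((5, 1))       # targets
w1 = rng.standard_normal((3, 4)); b1 = np.zeros(4)
w2 = rng.standard_normal((4, 1)); b2 = np.zeros(1)

# Forward pass
z1 = x @ w1 + b1                      # first linear layer
a1 = np.maximum(z1, 0)                # ReLU
out = a1 @ w2 + b2                    # second linear layer
loss = ((out - y) ** 2).mean()        # MSE loss

# Backward pass: apply the chain rule in reverse through each layer
dout = 2 * (out - y) / y.size         # d(loss)/d(out)
dw2 = a1.T @ dout                     # gradients of the second linear layer
db2 = dout.sum(0)
da1 = dout @ w2.T                     # propagate back through w2
dz1 = da1 * (z1 > 0)                  # ReLU gradient: pass where z1 > 0
dw1 = x.T @ dz1                       # gradients of the first linear layer
db1 = dz1.sum(0)
```

Each `d...` line multiplies the incoming gradient by the local derivative of that layer, which is the chain rule written out as matrix operations; a finite-difference check on any single weight is a quick way to confirm the gradients are right.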

