Aug. 31, 2023, 10:57 a.m. | /u/Excalibur-Prime

Deep Learning www.reddit.com

So I took course 2 of Andrew Ng's Deep Learning Specialization. In it, Andrew normalizes the inputs by subtracting the mean and then dividing by the variance, where the variance is calculated after subtracting the mean from each example. However, when I searched on Google I found Z-score normalization, where the inputs are divided by the standard deviation instead.
What is the correct way to normalize the inputs?

I am currently normalizing using the following Python code.
The input is a dataset …
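
Below is a minimal sketch of both variants for comparison (this is not the exact code from the post or the course; the array shape, example data, and variable names are assumptions). X is taken to be an (m, n) NumPy array with m examples as rows and n features as columns.

import numpy as np

# Assumed example data: m = 100 examples, n = 3 features.
X = np.random.rand(100, 3)

# Per-feature statistics.
mu = X.mean(axis=0)                    # mean of each feature
var = ((X - mu) ** 2).mean(axis=0)     # variance, computed after subtracting the mean
sigma = np.sqrt(var)                   # standard deviation

# Variant described in the post: divide the centered data by the variance.
X_course = (X - mu) / var

# Z-score normalization: divide the centered data by the standard deviation.
X_zscore = (X - mu) / sigma

The only difference between the two is the divisor: dividing by the standard deviation gives each feature unit variance (the usual Z-score), while dividing by the variance centers the data the same way but rescales it by a different factor.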

