Question about batch normalization
Aug. 15, 2022, 5:11 p.m. | /u/InvokeMeWell
Deep Learning www.reddit.com
I am reading the book Deep Learning with Python, 2nd edition, by François Chollet. Section 9.3.3 (p. 257) introduces batch normalization and recommends writing it like this:
x = layers.Conv2D(32, 3, use_bias=False)(x)
x = layers.BatchNormalization()(x)
x = layers.Activation("relu")(x)
instead of
x = layers.Conv2D(32, 3, use_bias=False, activation="relu")(x)
x = layers.BatchNormalization()(x)
He explains it like this:
"Doing normalization before the activation maximizes the utilization of the relu."
I understand this because …
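One way to see the intuition behind the quoted explanation is to compare how much of relu's gating actually gets used with and without normalization first. The following is a minimal NumPy sketch (not from the book; the shifted pre-activations and the simplified batch norm without learnable gamma/beta are my own illustrative assumptions):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Training-time batch normalization: zero mean, unit variance per feature
    # over the batch. Learnable scale/shift (gamma, beta) omitted for brevity.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)

# Hypothetical pre-activation outputs with a large positive shift,
# as might come out of a conv layer whose inputs are not centered.
pre_act = rng.normal(loc=3.0, scale=1.0, size=(256, 8))

# Without normalization, nearly every unit is positive, so relu passes
# almost everything through and behaves close to the identity.
active_without_bn = (relu(pre_act) > 0).mean()

# Batch norm recenters the batch around zero first, so relu zeroes out
# roughly half the units -- the regime where its nonlinearity matters.
active_with_bn = (relu(batch_norm(pre_act)) > 0).mean()

print(f"fraction of active units without BN: {active_without_bn:.2f}")
print(f"fraction of active units with BN:    {active_with_bn:.2f}")
```

Under these assumed statistics, the unnormalized batch saturates relu's linear region (nearly 100% of units active), while normalizing first leaves about half the units gated, which is presumably what "maximizes the utilization of the relu" refers to.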