Aug. 15, 2022, 5:11 p.m. | /u/InvokeMeWell

Deep Learning www.reddit.com

Hello,

I am reading the book Deep Learning with Python, 2nd edition, by François Chollet. Chapter 9.3.3 (p. 257) introduces batch normalization and recommends writing it like this:

x = layers.Conv2D(32, 3, use_bias=False)(x)

x = layers.BatchNormalization()(x)

x = layers.Activation("relu")(x)

instead of

x = layers.Conv2D(32, 3, use_bias=False, activation="relu")(x)

x = layers.BatchNormalization()(x)

And explains it like this:

"Doing normalization before the activation maximizes the utilization of the relu."

I understand this because …
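To make the quoted explanation concrete, here is a small NumPy sketch (my own illustration, not from the book): batch normalization centers each feature at zero, so a following ReLU clips roughly half the activations and the nonlinearity actually does something. If the ReLU came first on data with a large positive offset, it would be nearly a no-op.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch to zero mean, unit variance
    # (gamma=1, beta=0 for simplicity).
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
# A batch whose features sit around +5, well inside ReLU's linear region.
x = rng.normal(loc=5.0, scale=1.0, size=(1024, 8))

# BN before ReLU: the normalized input is centered at zero, so ReLU
# clips about half of the activations -- the nonlinearity "bites".
pre = relu(batch_norm(x))
print((pre == 0).mean())   # roughly 0.5

# ReLU applied directly to the offset data: almost nothing is clipped,
# so the ReLU is close to a no-op here.
post = relu(x)
print((post == 0).mean())  # close to 0.0
```

This only demonstrates the "utilization" intuition; in practice which ordering trains better can depend on the architecture.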

deeplearning normalization
