Feb. 5, 2022, 8:50 p.m. | Oscar Leo

Towards Data Science (Medium) | towardsdatascience.com

How I used batch normalization to get a 20% improvement on my eye-tracker during inference

Photo by Diego PH on Unsplash

Batch normalization is a staple of modern deep learning: normalizing a layer's output features before passing them on to the next layer stabilizes the training of large neural networks. Of course, that's not news to anyone interested in deep learning. But did you know that, for some use cases, batch normalization can significantly improve testing and inference as well?
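As a minimal sketch of the idea (a toy PyTorch model, not the author's actual eye-tracker), each convolutional layer's output feature maps pass through a BatchNorm2d layer before the next layer sees them. The train()/eval() toggle at the end is where batch norm's behavior differs between training and inference: train mode normalizes with the current batch's statistics, while eval mode uses the running mean and variance accumulated during training.

```python
import torch
import torch.nn as nn

# Hypothetical small conv net: each conv layer's outputs are
# batch-normalized before being passed to the next layer.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    nn.BatchNorm2d(16),   # normalize the 16 output feature maps
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 2),     # e.g. an (x, y) gaze estimate
)

x = torch.randn(8, 1, 64, 64)  # a batch of 8 grayscale frames

model.train()   # training mode: normalize with batch statistics
y_train = model(x)

model.eval()    # inference mode: normalize with running statistics
with torch.no_grad():
    y_eval = model(x)

print(y_train.shape, y_eval.shape)  # torch.Size([8, 2]) both times
```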

In this …

convolutional-network · data science · deep learning · machine learning · neural networks
