May 8, 2023, 6:01 p.m. | Luiz doleron

Towards AI - Medium pub.towardsai.net

Let’s have fun by implementing Gradient Descent in pure C++ and Eigen.

In this story, we will cover fitting 2D convolution kernels from data by introducing the Gradient Descent algorithm. We will use the convolutions and the concept of cost functions introduced in the previous story, coding everything in modern C++ and Eigen.
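As a rough sketch of the idea (not the article's actual code), the snippet below uses gradient descent to fit a 3x3 kernel so that its valid cross-correlation with a synthetic input matches a target produced by a known filter. The correlate2d helper, the true_kernel values, the learning rate, and the epoch count are illustrative assumptions; the MSE cost stands in for the cost function mentioned above.

```cpp
#include <Eigen/Dense>
#include <iostream>

// Valid 2D cross-correlation (the "convolution" commonly used in ML):
// slide the kernel over the input and sum the element-wise products.
Eigen::MatrixXd correlate2d(const Eigen::MatrixXd &input, const Eigen::MatrixXd &kernel)
{
    const int rows = input.rows() - kernel.rows() + 1;
    const int cols = input.cols() - kernel.cols() + 1;
    Eigen::MatrixXd output(rows, cols);
    for (int i = 0; i < rows; ++i)
        for (int j = 0; j < cols; ++j)
            output(i, j) = input.block(i, j, kernel.rows(), kernel.cols())
                               .cwiseProduct(kernel).sum();
    return output;
}

int main()
{
    // Synthetic data: the target is the input filtered by a "true" kernel
    // that gradient descent should approximately recover.
    Eigen::MatrixXd X = Eigen::MatrixXd::Random(8, 8);
    Eigen::MatrixXd true_kernel(3, 3);
    true_kernel << 0., -1., 0.,
                  -1.,  4., -1.,
                   0., -1., 0.;
    Eigen::MatrixXd T = correlate2d(X, true_kernel);

    // Gradient descent: start from a random kernel and repeatedly step
    // against the gradient of the MSE cost.
    Eigen::MatrixXd K = Eigen::MatrixXd::Random(3, 3);
    const double learning_rate = 0.1; // illustrative value
    for (int epoch = 0; epoch <= 1000; ++epoch)
    {
        Eigen::MatrixXd Y = correlate2d(X, K);
        Eigen::MatrixXd error = Y - T;
        // For the MSE cost, dCost/dKernel is the valid cross-correlation
        // of the input with the output error.
        Eigen::MatrixXd gradient = correlate2d(X, error) / double(T.size());
        K -= learning_rate * gradient;

        if (epoch % 200 == 0)
        {
            double cost = error.array().square().mean() / 2.;
            std::cout << "epoch " << epoch << "  cost " << cost << "\n";
        }
    }
    std::cout << "fitted kernel:\n" << K << "\n";
    return 0;
}
```

Built against Eigen (header-only, so something like g++ -I /path/to/eigen main.cpp suffices), the cost printed every 200 epochs should fall steadily as the fitted kernel approaches the filter that generated the target.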

About this series

In this series, we will learn how to code the must-know deep learning algorithms such as convolutions, backpropagation, activation functions, optimizers, deep …

