June 3, 2022, 1:29 p.m. | Alexander Kovalenko

Towards Data Science - Medium (towardsdatascience.com)

An alternative way to “train” a neural network with evolution

Photo by Misael Moreno on Unsplash

At the moment, backpropagation is by far the dominant way to train a neural network. It computes the gradient of the loss function with respect to the network's weights and then updates those weights by propagating the error backward through the layers using the chain rule.
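As a concrete illustration (not code from the article), a single backpropagation step for a tiny one-layer sigmoid network with a squared-error loss might look like the following NumPy sketch; the data, network shape, and learning rate are assumptions made only for the example.

```python
import numpy as np

# Minimal sketch: one gradient-descent update via backpropagation
# for a single-layer sigmoid network with squared-error loss.
rng = np.random.default_rng(0)

X = rng.normal(size=(8, 3))                    # 8 toy samples, 3 features (assumed)
y = rng.integers(0, 2, size=(8, 1)).astype(float)  # toy binary targets (assumed)
W = rng.normal(size=(3, 1))                    # weights to be trained
lr = 0.1                                       # learning rate (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: predictions and loss
z = X @ W
p = sigmoid(z)
loss = np.mean((p - y) ** 2)

# Backward pass: chain rule from the loss down to the weights
dloss_dp = 2 * (p - y) / len(X)   # d(loss)/d(p)
dp_dz = p * (1 - p)               # d(sigmoid)/dz
grad_W = X.T @ (dloss_dp * dp_dz) # d(loss)/dW

# Gradient-descent update
W -= lr * grad_W
```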

Recently, the community has been trying to go beyond backpropagation, as maestro Geoffrey Hinton himself, who popularized the …

Tags: backpropagation, evolution, genetic-algorithm, gradient-descent, network, neural network, neural networks
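To give a flavor of the gradient-free alternative the title refers to, here is a minimal sketch of evolving the same toy network's weights with a simple selection-and-mutation loop. This is a generic elitist genetic-algorithm style loop, not the author's method; the data, population size, mutation scale, and generation count are all assumptions for the example.

```python
import numpy as np

# Minimal sketch of "training" by evolution (not the article's implementation):
# keep a population of weight vectors, score each by its loss, keep the best,
# and refill the population with mutated copies; no gradients are computed.
rng = np.random.default_rng(0)

X = rng.normal(size=(8, 3))                    # toy samples (assumed)
y = rng.integers(0, 2, size=(8, 1)).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    # Squared-error loss of a single-layer sigmoid network with weights w
    p = sigmoid(X @ w.reshape(3, 1))
    return float(np.mean((p - y) ** 2))

pop_size, n_elite, sigma = 32, 4, 0.1          # assumed hyperparameters
population = [rng.normal(size=3) for _ in range(pop_size)]

for generation in range(100):
    # Selection: keep the individuals with the lowest loss
    population.sort(key=loss)
    elite = population[:n_elite]
    # Mutation: refill the population with noisy copies of the elite
    children = []
    while len(elite) + len(children) < pop_size:
        parent = elite[rng.integers(len(elite))]
        children.append(parent + sigma * rng.normal(size=3))
    population = elite + children

print("best loss:", loss(min(population, key=loss)))
```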
