June 3, 2022, 1:29 p.m. | Alexander Kovalenko

Towards Data Science - Medium towardsdatascience.com

An alternative way to “train” a neural network with evolution

Photo by Misael Moreno on Unsplash

At the moment, backpropagation is virtually the only way to train a neural network. It computes the gradient of the loss function with respect to the network's weights and then updates those weights by propagating the error backwards through the layers using the chain rule.
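As a minimal sketch of what that means in practice (not code from the article, and using a toy single linear layer with squared-error loss), the chain rule gives the gradient of the loss with respect to the weights, and gradient descent applies it:

```python
# Minimal sketch: one gradient-descent training loop for a single linear
# layer with mean-squared-error loss, using NumPy only.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 samples, 3 features, scalar target (all illustrative).
X = rng.normal(size=(8, 3))
y = rng.normal(size=(8, 1))

# Weights of a single linear layer: y_hat = X @ W.
W = rng.normal(size=(3, 1))
lr = 0.1  # learning rate

for step in range(100):
    y_hat = X @ W                         # forward pass
    loss = np.mean((y_hat - y) ** 2)      # mean squared error
    # Chain rule: dL/dW = (dL/dy_hat) * (dy_hat/dW)
    grad_W = 2 * X.T @ (y_hat - y) / len(X)
    W -= lr * grad_W                      # gradient-descent weight update
```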

Recently, the community has been trying to go beyond backpropagation, as maestro Geoffrey Hinton himself, who popularized the …
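The excerpt is cut off here, but the article's premise is training with evolution instead of gradients. A generic sketch of that idea (not the author's exact method) is to treat the same weights as a population of candidates, score each candidate by a fitness function, and repeatedly select and mutate the best ones; no gradient is ever computed:

```python
# Generic evolutionary search over the same linear-layer weights,
# shown only as an illustration of gradient-free training.
import numpy as np

rng = np.random.default_rng(0)

X = rng.normal(size=(8, 3))
y = rng.normal(size=(8, 1))

def fitness(W):
    """Negative mean squared error: higher is better."""
    return -np.mean((X @ W - y) ** 2)

pop_size, sigma = 50, 0.1  # population size and mutation scale (assumed values)
population = [rng.normal(size=(3, 1)) for _ in range(pop_size)]

for generation in range(200):
    # Selection: rank candidates by fitness and keep the best half.
    population.sort(key=fitness, reverse=True)
    parents = population[: pop_size // 2]
    # Mutation: refill the population with noisy copies of the parents.
    children = [p + sigma * rng.normal(size=p.shape) for p in parents]
    population = parents + children

best = max(population, key=fitness)
```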

backpropagation evolution genetic-algorithm gradient-descent network neural network neural networks
