May 23, 2024, 1:58 a.m. | /u/guesswho135

Machine Learning www.reddit.com

I watched a lecture of his from about 4 years ago where he swats down all of the objections to backpropagation as a learning mechanism in the brain. But I recall hearing him on a podcast more recently (can't find it anymore) in which he was skeptical of backprop, and seemed to suggest that Hebbian learning was more important. I'm curious to know his current beliefs and why. What is the most recent interview or lecture where he discusses this? …
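
For anyone unfamiliar with the distinction the question turns on, here is a minimal NumPy sketch (purely illustrative, not from any Hinton lecture) contrasting the two update rules. The Hebbian update is local, each synapse changes based only on the activity of the two neurons it connects, whereas the backprop update needs an error signal carried back to the synapse, which is the core of the biological-plausibility objection:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-neuron setup: input x, weights w, scalar target t.
x = rng.normal(size=3)   # presynaptic activity
w = rng.normal(size=3)   # synaptic weights
t = 1.0                  # target output
lr = 0.1                 # learning rate

y = w @ x                # postsynaptic activity

# Hebbian update: "fire together, wire together" -- uses only
# locally available pre- and postsynaptic activity.
dw_hebb = lr * y * x

# Backprop update for squared error L = 0.5 * (y - t)**2:
# dL/dw = (y - t) * x, so the error (y - t) must reach the synapse.
dw_backprop = -lr * (y - t) * x
```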

