May 23, 2024, 1:58 a.m. | /u/guesswho135

Machine Learning www.reddit.com

I watched a lecture of his from about 4 years ago where he swats down all of the objections to backpropagation as a learning mechanism in the brain. But I recall hearing him on a podcast more recently (can't find it anymore) in which he was skeptical of backprop, and seemed to suggest that Hebbian learning was more important. I'm curious to know his current beliefs and why. What is the most recent interview or lecture where he discusses this? …
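For anyone unfamiliar with the two mechanisms being contrasted, here is a minimal sketch (my own illustration, not from any Hinton lecture) of why the distinction matters biologically: a Hebbian update uses only information local to the synapse, while a backprop-style update additionally needs an error signal propagated back from a target, which is the part whose plausibility in the brain is debated. All names and values are hypothetical.

```python
import numpy as np

# Single linear neuron: output y = w . x
rng = np.random.default_rng(0)
x = rng.normal(size=3)    # presynaptic activity
w = rng.normal(size=3)    # synaptic weights
eta = 0.1                 # learning rate

# Hebbian rule: purely local ("cells that fire together wire together").
# delta_w depends only on pre- and post-synaptic activity.
y = w @ x
w_hebb = w + eta * x * y

# Backprop (delta rule for this one layer, squared loss): the update
# requires an error relative to an externally supplied target -- a
# non-local teaching signal the Hebbian rule does not need.
target = 1.0
error = y - target        # dLoss/dy (up to a constant factor)
w_bp = w - eta * error * x
```

The Hebbian update can be computed entirely at the synapse; the backprop update cannot, and proposals like feedback alignment or Hinton's forward-forward work exist precisely to get around that.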

