Nov. 15, 2022, 2:12 a.m. | Timoleon Moraitis, Dmitry Toichkin, Adrien Journé, Yansong Chua, Qinghai Guo

cs.LG updates on arXiv.org

Hebbian plasticity in winner-take-all (WTA) networks is highly attractive for
neuromorphic on-chip learning, owing to its efficient, local, unsupervised, and
online nature. Moreover, its biological plausibility may help overcome
important limitations of artificial algorithms, such as their susceptibility to
adversarial attacks and their high demands on the quantity and repetition of
training examples. However, Hebbian WTA learning has found little use in
machine learning (ML), likely because it has lacked an optimization theory
compatible with deep learning (DL). Here we show rigorously …
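To make the "local, unsupervised, and online" character of Hebbian WTA learning concrete, here is a minimal sketch of a generic competitive-learning update (an instar-style rule): the unit with the highest activation wins, and only its weights move toward the input. This is a standard textbook illustration, not the specific rule or theory derived in the paper; the function name and learning rate are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def wta_hebbian_step(W, x, lr=0.1):
    """One winner-take-all Hebbian update (instar-style rule).

    W: (num_units, dim) weight matrix; x: (dim,) input vector.
    The update is local (touches only the winner's row), unsupervised
    (no labels), and online (one sample at a time).
    """
    winner = int(np.argmax(W @ x))     # competition: highest activation wins
    W[winner] += lr * (x - W[winner])  # pull the winner's weights toward x
    return winner

# Toy run: the winning unit's weights drift toward the inputs it wins.
W = rng.normal(size=(3, 4))
for _ in range(100):
    x = rng.normal(loc=1.0, size=4)
    wta_hebbian_step(W, x)
```

Each step updates a single row of `W`, which is what makes rules of this family attractive for neuromorphic on-chip learning: no global error signal has to be propagated.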

