March 20, 2024, 4:42 a.m. | Dongshu Liu, Jérémie Laydevant, Adrien Pontlevy, Damien Querlioz, Julie Grollier

cs.LG updates on arXiv.org

arXiv:2403.12116v1 Announce Type: cross
Abstract: Current unsupervised learning methods depend on end-to-end training via deep learning techniques such as self-supervised learning, with high computational requirements, or employ layer-by-layer training using bio-inspired approaches like Hebbian learning, whose local learning rules are incompatible with supervised learning. Both approaches are problematic for edge AI hardware that relies on sparse computational resources and would strongly benefit from alternating between unsupervised and supervised learning phases - thus leveraging widely available unlabeled data from the environment as …
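To make the contrast the abstract draws more concrete, below is a minimal sketch of a local Hebbian update (Oja's rule) in NumPy. It only illustrates why such rules work layer by layer without a backpropagated error signal; the layer sizes, learning rate, and the specific Oja-style decay are illustrative assumptions, not the method proposed in the paper.

```python
import numpy as np

# Minimal sketch, assuming a single fully connected layer trained with a
# local Hebbian rule (Oja's variant). All hyperparameters are placeholders.
rng = np.random.default_rng(0)
n_in, n_out, lr = 64, 16, 0.01

W = rng.normal(scale=0.1, size=(n_out, n_in))  # layer weights
x = rng.random(n_in)                           # one unlabeled input sample

y = np.tanh(W @ x)                             # post-synaptic activity

# Oja-style Hebbian update: each weight changes based only on its own
# pre-synaptic input x and post-synaptic output y, so no error signal
# from other layers (i.e. no end-to-end backpropagation) is required.
W += lr * (np.outer(y, x) - (y ** 2)[:, None] * W)
```

Because the update for each layer depends only on locally available activities, layers can be trained one at a time on unlabeled data, which is the layer-by-layer regime the abstract contrasts with end-to-end self-supervised training.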

