March 20, 2024, 4:42 a.m. | Dongshu Liu, Jérémie Laydevant, Adrien Pontlevy, Damien Querlioz, Julie Grollier

cs.LG updates on arXiv.org

arXiv:2403.12116v1 Announce Type: cross
Abstract: Current unsupervised learning methods either depend on end-to-end training via deep learning techniques such as self-supervised learning, which have high computational requirements, or employ layer-by-layer training with bio-inspired approaches such as Hebbian learning, whose local learning rules are incompatible with supervised learning. Both approaches are problematic for edge AI hardware, which relies on sparse computational resources and would strongly benefit from alternating between unsupervised and supervised learning phases - thus leveraging widely available unlabeled data from the environment as …
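For readers unfamiliar with the "local learning rules" the abstract contrasts with end-to-end training, below is a minimal sketch of a layer-local Hebbian update (Oja's rule). It is a generic textbook illustration, not the method proposed in the paper; the layer sizes, learning rate, and random data are assumptions for the example only.

```python
import numpy as np

# Minimal sketch of a layer-local Hebbian update (Oja's rule).
# Generic illustration only, not the paper's proposed method.
rng = np.random.default_rng(0)
n_in, n_out = 64, 16                            # assumed layer dimensions
W = rng.normal(scale=0.1, size=(n_out, n_in))   # weights of a single layer
lr = 1e-2                                       # assumed learning rate

def hebbian_step(W, x, lr):
    """One unsupervised update using only the layer's own input x and output y,
    i.e. no error signal propagated from other layers (the rule is local)."""
    y = W @ x                                   # layer activation
    # Oja's rule: Hebbian term y * x^T with a decay that keeps weights bounded
    return W + lr * (np.outer(y, x) - (y ** 2)[:, None] * W)

# Layer-by-layer style training: feed unlabeled samples, apply the local rule.
for _ in range(1000):
    x = rng.normal(size=n_in)                   # stand-in for an unlabeled input
    W = hebbian_step(W, x, lr)
```

Because each update depends only on the layer's own input and output, such rules need no backpropagated gradients, which is what makes them attractive for resource-constrained edge hardware but hard to combine with standard supervised training.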

