Can a Hebbian-like learning rule be avoiding the curse of dimensionality in sparse distributed data? (arXiv:2208.12564v1 [cs.NE])
Aug. 29, 2022, 1:11 a.m. | Maria Osório, Luís Sa-Couto, Andreas Wichert
cs.LG updates on arXiv.org
It is generally assumed that the brain uses something akin to sparse distributed representations. These representations, however, are high-dimensional, and they consequently degrade the classification performance of traditional Machine Learning models due to "the curse of dimensionality". In tasks for which there is a vast amount of labeled data, Deep Networks seem to solve this issue with many layers and a non-Hebbian backpropagation algorithm. The brain, however, seems able to solve the problem with few layers. In this work, …
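The abstract is truncated here, so the paper's actual model is not shown. As a rough illustration of the contrast it draws between backpropagation and local learning, below is a minimal sketch of a generic Hebbian-like associator classifying sparse binary patterns; the dimensions, the two-class setup, and the sum-of-patterns update rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy data (assumed sizes, not from the paper):
# sparse binary patterns with k_active of dim units on.
n_inputs, dim, k_active = 100, 512, 16
X = np.zeros((n_inputs, dim))
for x in X:
    x[rng.choice(dim, size=k_active, replace=False)] = 1.0

labels = rng.integers(0, 2, size=n_inputs)  # two arbitrary classes

# One Hebbian-like pass: each class's weight vector accumulates
# co-activity with the inputs of that class (w_y += x), a purely
# local update with no backpropagated error signal.
W = np.zeros((2, dim))
for x, y in zip(X, labels):
    W[y] += x

# Classify by inner product with the learned class vectors.
pred = np.argmax(X @ W.T, axis=1)
print("training accuracy:", (pred == labels).mean())
```

Because the patterns are sparse, each weight vector stays close to the union of a few active units, which is one intuition for why such local rules can remain usable in high-dimensional spaces.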
Tags: arxiv, data, dimensionality, distributed, distributed data, learning, the curse of dimensionality
Jobs in AI, ML, Big Data
AI Research Scientist
@ Vara | Berlin, Germany and Remote
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
AI Engineering Manager
@ M47 Labs | Barcelona, Catalunya, Spain