Aug. 29, 2022, 1:11 a.m. | Maria Osório, Luís Sa-Couto, Andreas Wichert

cs.LG updates on arXiv.org

It is generally assumed that the brain uses something akin to sparse
distributed representations. These representations, however, are
high-dimensional, and consequently they degrade the classification
performance of traditional machine-learning models due to "the curse of
dimensionality". In tasks with vast amounts of labeled data, deep networks
seem to overcome this issue with many layers and a non-Hebbian
backpropagation algorithm. The brain, however, seems able to solve the
problem with few layers. In this work, …
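As a minimal illustration of the kind of code the abstract refers to (not the authors' actual model), a sparse distributed representation can be sketched as a high-dimensional binary vector with only a small fraction of active units; two independently drawn codes then overlap almost nowhere, which is what makes them hard for distance-based classifiers but easy to keep separable. The dimensions and sparsity level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(dim=10_000, k=100):
    """Return a random k-sparse binary vector in a dim-dimensional space."""
    v = np.zeros(dim, dtype=np.int8)
    active = rng.choice(dim, size=k, replace=False)  # pick k active units
    v[active] = 1
    return v

a, b = sparse_code(), sparse_code()
print(int(a.sum()))   # 100 active units out of 10,000 (1% sparsity)
print(int(a @ b))     # overlap of two random codes: expected k**2/dim = 1
```

The tiny expected overlap (here about one shared active unit) hints at why such codes are robust to interference, while the raw dimensionality of 10,000 is what strains conventional classifiers.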

