Oct. 3, 2022, 1:13 a.m. | Danilo Samuel Jodas, Leandro Aparecido Passos, Ahsan Adeel, João Paulo Papa

cs.LG updates on arXiv.org arxiv.org

A minimal parameter setup is desirable in machine learning models to avoid time-consuming optimization processes. The $k$-Nearest Neighbors classifier is one of the most effective and straightforward models employed in numerous problems. Despite its well-known performance, it requires the value of $k$ to be tuned to the specific data distribution, thus demanding expensive computational effort. This paper proposes a $k$-Nearest Neighbors classifier that bypasses the need to define the value of $k$. The model computes the $k$ value adaptively, considering the data distribution of …

arxiv classifier knn neighbors
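The truncated abstract only states that the neighborhood size is computed adaptively from the data distribution; the exact rule is not visible here. Below is a minimal sketch, assuming one plausible parameterless scheme in which each query's neighborhood is the set of training samples closer than the query's mean distance to the training set. The class name AdaptiveKNN and the mean-distance threshold are illustrative assumptions, not the authors' method.

# Sketch of a nearest-neighbors classifier with no user-supplied k.
# NOTE: the thresholding rule below is an assumption for illustration,
# not the algorithm from the paper (only its truncated abstract is shown).
import numpy as np


class AdaptiveKNN:
    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        preds = []
        for x in X:
            # Euclidean distances from the query to every training sample.
            d = np.linalg.norm(self.X - x, axis=1)
            # Adaptive neighborhood: samples closer than the mean distance.
            mask = d < d.mean()
            if not mask.any():
                # Degenerate case (e.g., a single training point): fall back to 1-NN.
                mask = d == d.min()
            labels, counts = np.unique(self.y[mask], return_counts=True)
            preds.append(labels[np.argmax(counts)])  # majority vote
        return np.array(preds)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    clf = AdaptiveKNN().fit(X, y)
    print(clf.predict([[0, 0], [4, 4]]))  # expected: [0 1]

The design point this sketch illustrates is that the neighborhood size varies per query with the local distance distribution, so no global $k$ has to be cross-validated.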
