[R] 1nn with subsampling is infinity-nn with a specific set of weights
Sept. 28, 2022, 9:38 p.m. | /u/SantyClause
Machine Learning www.reddit.com
We're concerned with the accuracy of ~100 estimates added together, rather than the accuracy of any single estimate, so bias mattered most to us. As long as we didn't systematically over- or under-predict, the final combined estimate would be very good.
I recently discovered that 1-NN had worse precision than k-NN, but its bias was significantly better. So I then thought to …
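The trade-off the post describes can be sketched numerically: a 1-NN fit uses a much smaller neighborhood than a k-NN fit, so each prediction is noisier (worse precision) but less smoothed toward the neighborhood average (better bias), and averaging many 1-NN fits over random subsamples keeps the low bias while taming the variance. Below is a minimal 1-D simulation of that idea; the target function, sample sizes, and `knn_predict` helper are all assumptions for illustration, not the poster's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(x_train, y_train, x_query, k):
    """Plain k-NN regression in 1-D: average the y-values of the k nearest x's."""
    d = np.abs(x_query[:, None] - x_train[None, :])
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

true_f = lambda x: np.sin(2 * np.pi * x)   # assumed smooth target
x_query = np.linspace(0.2, 0.8, 50)

preds_1nn, preds_knn = [], []
for _ in range(500):
    x = rng.uniform(0, 1, 200)
    y = true_f(x) + rng.normal(0, 0.1, 200)
    sub = rng.choice(200, 100, replace=False)        # random half-subsample
    preds_1nn.append(knn_predict(x[sub], y[sub], x_query, k=1))
    preds_knn.append(knn_predict(x, y, x_query, k=40))

preds_1nn = np.array(preds_1nn)
preds_knn = np.array(preds_knn)

# Bias of the *averaged* estimate, and variance of a single fit
bias_1nn = np.mean(np.abs(preds_1nn.mean(axis=0) - true_f(x_query)))
bias_knn = np.mean(np.abs(preds_knn.mean(axis=0) - true_f(x_query)))
var_1nn = preds_1nn.var(axis=0).mean()
var_knn = preds_knn.var(axis=0).mean()
print(f"subsampled 1-NN: |bias|={bias_1nn:.4f}  variance={var_1nn:.4f}")
print(f"40-NN:           |bias|={bias_knn:.4f}  variance={var_knn:.4f}")
```

Under these assumptions the averaged subsampled 1-NN estimate shows lower bias but far higher per-fit variance than the single 40-NN fit, which is exactly the regime where averaging many estimates pays off. Note also the connection to the title: averaging 1-NN over random subsamples is equivalent to a weighted average over all training points, where each point's weight is the probability that it is the nearest neighbor within a subsample.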