June 20, 2022, 1:11 a.m. | Guy Blanc, Jane Lange, Ali Malik, Li-Yang Tan

cs.LG updates on arXiv.org

Using the framework of boosting, we prove that all impurity-based decision
tree learning algorithms, including the classic ID3, C4.5, and CART, are highly
noise tolerant. Our guarantees hold under the strongest noise model of nasty
noise, and we provide near-matching upper and lower bounds on the allowable
noise rate. We further show that these algorithms, which are simple and have
long been central to everyday machine learning, enjoy provable guarantees in
the noisy setting that are unmatched by existing algorithms …
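The "impurity-based" algorithms named above all follow the same template: at each node, greedily split on the feature that most reduces an impurity measure of the labels. A minimal sketch of that mechanism is below, using Gini impurity as in CART (ID3 and C4.5 use entropy-based measures instead); the toy dataset and helper names are illustrative, not from the paper.

```python
# Sketch of impurity-based split selection, the mechanism shared by
# ID3, C4.5, and CART. Gini impurity (CART's choice) is shown here.

def gini(labels):
    """Gini impurity of a list of binary labels: 2*p*(1-p)."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def split_gain(X, y, feature):
    """Impurity reduction from splitting on a binary feature."""
    left = [yi for xi, yi in zip(X, y) if xi[feature] == 0]
    right = [yi for xi, yi in zip(X, y) if xi[feature] == 1]
    n = len(y)
    weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    return gini(y) - weighted

def best_split(X, y):
    """Pick the feature with the largest impurity reduction."""
    return max(range(len(X[0])), key=lambda f: split_gain(X, y, f))

# Toy dataset: feature 0 perfectly predicts the label, feature 1 does not.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 1, 1]
print(best_split(X, y))  # → 0 (feature 0 gives the largest gain)
```

The paper's claim, roughly, is that this greedy impurity-driven choice remains reliable even when an adversary corrupts a bounded fraction of the training examples (nasty noise).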

