Limit Cycles of AdaBoost. (arXiv:2209.06928v1 [cs.LG])
Sept. 16, 2022, 1:13 a.m. | Conor Snedeker
stat.ML updates on arXiv.org arxiv.org
The iterative weight update for the AdaBoost machine learning algorithm may
be realized as a dynamical map on a probability simplex. When learning a
low-dimensional data set this algorithm has a tendency towards cycling
behavior, which is the topic of this paper. AdaBoost's cycling behavior lends
itself to direct computational methods that are ineffective in the general,
non-cycling case of the algorithm. From these computational properties we
establish a concrete correspondence between AdaBoost's cycling behavior and
continued fraction dynamics. Then …
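The "dynamical map on a probability simplex" the abstract refers to can be made concrete with a minimal sketch of the standard AdaBoost re-weighting step (this is the textbook update, not code from the paper; the function name and the toy data are illustrative):

```python
import numpy as np

def adaboost_weight_update(w, correct):
    """One AdaBoost round, viewed as a map on the probability simplex.

    w       -- current example weights, a point on the simplex (sums to 1)
    correct -- boolean array marking examples the weak learner classified correctly
    """
    eps = np.sum(w[~correct])                # weighted error of the weak learner
    alpha = 0.5 * np.log((1 - eps) / eps)    # weak learner's vote weight
    # Misclassified examples are up-weighted, correct ones down-weighted.
    w_new = w * np.exp(np.where(correct, -alpha, alpha))
    return w_new / w_new.sum()               # renormalize back onto the simplex

# Toy example: uniform weights over 4 points; the weak learner misses one.
w = np.ones(4) / 4
correct = np.array([True, True, True, False])
w = adaboost_weight_update(w, correct)
# The updated weights again sum to 1, and the misclassified example now
# carries exactly half the total weight -- a standard property of the update.
```

Iterating this map on a fixed low-dimensional data set is what can produce the periodic orbits (limit cycles) the paper studies.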
Jobs in AI, ML, Big Data
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Social Insights & Data Analyst (Freelance)
@ Media.Monks | Jakarta
Cloud Data Engineer
@ Arkatechture | Portland, ME, USA