Limit Cycles of AdaBoost. (arXiv:2209.06928v1 [cs.LG])
Sept. 16, 2022, 1:11 a.m. | Conor Snedeker
cs.LG updates on arXiv.org arxiv.org
The iterative weight update of the AdaBoost machine learning algorithm may
be realized as a dynamical map on a probability simplex. When learning a
low-dimensional data set, this algorithm has a tendency toward cycling
behavior, which is the topic of this paper. AdaBoost's cycling behavior lends
itself to direct computational methods that are ineffective in the general,
non-cycling case of the algorithm. From these computational properties, we give
a concrete correspondence between AdaBoost's cycling behavior and
continued-fraction dynamics. Then …
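The dynamical map the abstract refers to is the standard AdaBoost re-weighting step: given sample weights summing to one, the weak learner's weighted error determines a step size α, misclassified points are up-weighted by e^α, correctly classified points are down-weighted by e^{−α}, and the weights are renormalized back onto the simplex. A minimal sketch of that one-step map (the function name and toy data are illustrative, not from the paper):

```python
import numpy as np

def adaboost_weight_update(w, correct):
    """One AdaBoost step, viewed as a map on the probability simplex.

    w       : current sample weights, a point on the simplex (sums to 1)
    correct : boolean array, True where the weak learner is right
    """
    eps = np.sum(w[~correct])              # weighted error of the weak learner
    alpha = 0.5 * np.log((1 - eps) / eps)  # AdaBoost step size
    # up-weight misclassified points, down-weight correct ones
    signs = np.where(correct, -1.0, 1.0)
    w_new = w * np.exp(alpha * signs)
    return w_new / w_new.sum()             # renormalize onto the simplex

# Toy example: three points, the weak learner misses only the first one.
w = np.array([1 / 3, 1 / 3, 1 / 3])
correct = np.array([False, True, True])
w = adaboost_weight_update(w, correct)
# The misclassified point now carries weight 0.5, so this same weak
# learner's weighted error on the updated weights is exactly 1/2.
```

Iterating this map on a fixed, finite pool of weak learners is what produces the cycling (periodic-orbit) behavior the paper studies.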