Scrap Your Schedules with PopDescent
April 26, 2024, 4:42 a.m. | Abhinav Pomalapally, Bassel El Mabsout, Renato Mancuso
cs.LG updates on arXiv.org
Abstract: In contemporary machine learning workloads, hyper-parameter search algorithms are widely used to efficiently discover high-performing hyper-parameter values, such as learning and regularization rates. Accordingly, a range of parameter schedules has been designed to exploit the ability to adjust hyper-parameters during training and thereby improve loss performance. These schedules, however, introduce new hyper-parameters that must themselves be searched, and they do not account for the current loss values of the models being trained.
To address these issues, …
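The abstract's complaint about schedules is easy to see in code. Below is a minimal, hypothetical Python sketch (not PopDescent itself, whose method the truncated abstract does not describe) contrasting a fixed exponential-decay schedule, which adds its own `decay` hyper-parameter and ignores training progress, with a loss-aware rule that only cuts the learning rate when the loss plateaus. All function names, parameters, and constants here are illustrative assumptions.

```python
# Illustrative sketch only -- not the PopDescent algorithm. It contrasts the
# two scheduling styles the abstract describes, on a toy 1-D quadratic.

def loss(w):
    return (w - 3.0) ** 2           # minimum at w = 3


def grad(w):
    return 2.0 * (w - 3.0)


def train(adjust_lr, steps=50, w=0.0, lr=0.1):
    """Plain gradient descent; `adjust_lr` decides the next learning rate."""
    history = []
    for step in range(steps):
        w -= lr * grad(w)
        history.append(loss(w))
        lr = adjust_lr(lr, step, history)
    return w, history[-1]


def exp_decay(lr, step, history, decay=0.95):
    # Fixed schedule: `decay` is a brand-new hyper-parameter to tune,
    # and the current loss values (`history`) are ignored entirely.
    return lr * decay


def reduce_on_plateau(lr, step, history, patience=3, factor=0.5):
    # Loss-aware rule: shrink lr only if the last `patience` steps failed
    # to improve on the best earlier loss.
    if len(history) > patience and min(history[-patience:]) >= min(history[:-patience]):
        return lr * factor
    return lr


if __name__ == "__main__":
    print("fixed exp decay :", train(exp_decay))
    print("loss-aware rule :", train(reduce_on_plateau))
```

On this toy problem the loss-aware rule converges faster simply because it never slows down while progress is still being made; the point is only that the fixed schedule's behavior is governed by `decay`, a knob that itself needs searching, which is exactly the issue the abstract raises.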