General Cyclical Training of Neural Networks. (arXiv:2202.08835v2 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2202.08835
June 17, 2022, 1:13 a.m. | Leslie N. Smith
cs.CV updates on arXiv.org
This paper describes the principle of "General Cyclical Training" in machine
learning, where training starts and ends with "easy training" and the "hard
training" happens during the middle epochs. We propose several manifestations
for training neural networks, including algorithmic examples (via
hyper-parameters and loss functions), data-based examples, and model-based
examples. Specifically, we introduce several novel techniques: cyclical weight
decay, cyclical batch size, cyclical focal loss, cyclical softmax temperature,
cyclical data augmentation, cyclical gradient clipping, and cyclical
semi-supervised learning. In addition, …
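As a concrete sketch of the easy-hard-easy idea, the schedule below ramps a single hyper-parameter, the weight decay, from a small value up to a peak at mid-training and back down. The triangular shape, the wd_min/wd_max bounds, and the reading of heavier regularization as "hard training" are illustrative assumptions, not settings taken from the paper.

def cyclical_weight_decay(epoch, total_epochs, wd_min=1e-5, wd_max=1e-3):
    # Triangular easy-hard-easy schedule: returns wd_min at the first and
    # last epochs and wd_max at the midpoint. Shape and bounds are
    # illustrative assumptions, not values prescribed by the paper.
    mid = total_epochs / 2
    progress = 1.0 - abs(epoch - mid) / mid  # 0 at start/end, 1 at midpoint
    return wd_min + (wd_max - wd_min) * progress

# Usage (PyTorch-style, assuming an existing `optimizer`), applied once per epoch:
# for epoch in range(total_epochs):
#     for group in optimizer.param_groups:
#         group["weight_decay"] = cyclical_weight_decay(epoch, total_epochs)
#     train_one_epoch(...)

The same triangular template could apply to the other cyclical quantities the abstract lists (batch size, softmax temperature, gradient clipping threshold, and so on) by swapping in the appropriate bounds.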