MBGDT: Robust Mini-Batch Gradient Descent. (arXiv:2206.07139v1 [cs.LG])
Web: http://arxiv.org/abs/2206.07139
June 16, 2022, 1:10 a.m. | Hanming Wang, Haozheng Luo, Yue Wang
cs.LG updates on arXiv.org
In high dimensions, most machine learning methods are fragile when even a few outliers are present. To address this, we introduce a new method that wraps a base learner, such as Bayesian regression or stochastic gradient descent, to reduce the model's vulnerability. Because mini-batch gradient descent allows for more robust convergence than batch gradient descent, we build our method on mini-batch gradient descent and call it Mini-Batch Gradient Descent with Trimming (MBGDT). Our …
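The truncated abstract does not spell out the trimming rule, but a natural reading is that each mini-batch drops its highest-loss samples before the gradient step, so gross outliers cannot dominate the update. The sketch below illustrates that idea for linear regression; the function name mbgdt, its parameters, and the squared-loss trimming criterion are assumptions for illustration, not the authors' exact algorithm.

import numpy as np

def mbgdt(X, y, batch_size=32, trim_frac=0.1, lr=0.01, epochs=100, seed=0):
    """Hypothetical sketch of mini-batch gradient descent with trimming:
    in each mini-batch, the samples with the largest squared residuals
    are discarded before the gradient step. Illustration only; the
    paper's exact trimming rule may differ."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    keep = batch_size - int(trim_frac * batch_size)  # samples kept per batch
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            residuals = Xb @ w - yb
            # Trim: keep only the samples with the smallest squared loss,
            # so outliers do not contribute to this gradient step.
            kept = np.argsort(residuals ** 2)[:keep]
            Xk, rk = Xb[kept], residuals[kept]
            grad = 2.0 * Xk.T @ rk / len(kept)  # gradient of MSE on kept samples
            w -= lr * grad
    return w

# Toy usage: a clean linear signal with a handful of injected outliers.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=500)
y[:10] += 50.0  # gross outliers
w_hat = mbgdt(X, y)

Because the trimmed samples are chosen per batch against the current parameters, a point that looks like an outlier early in training can re-enter later batches once the fit improves, which is part of what makes per-batch trimming less brittle than discarding points once up front.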
Latest AI/ML/Big Data Jobs
Machine Learning Researcher - Saalfeld Lab
@ Howard Hughes Medical Institute - Chevy Chase, MD | Ashburn, Virginia
Project Director, Machine Learning in US Health
@ ideas42.org | Remote, US
Data Science Intern
@ NannyML | Remote
Machine Learning Engineer NLP/Speech
@ Play.ht | Remote
Research Scientist, 3D Reconstruction
@ Yembo | Remote, US
Clinical Assistant or Associate Professor of Management Science and Systems
@ University at Buffalo | Buffalo, NY