Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization
April 22, 2024, 4:43 a.m. | Wei Shen, Minhui Huang, Jiawei Zhang, Cong Shen
cs.LG updates on arXiv.org
Abstract: In recent years, federated minimax optimization has attracted growing interest due to its extensive applications in various machine learning tasks. While Smoothed Alternating Gradient Descent Ascent (Smoothed-AGDA) has proved successful in centralized nonconvex minimax optimization, how and whether the smoothing technique can help in the federated setting remains unexplored. In this paper, we propose a new algorithm, termed Federated Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), which applies the smoothing technique to federated minimax optimization. We …
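The truncated abstract stops before the algorithmic details, but the smoothing idea behind Smoothed-AGDA is well documented: couple the primal variable to a slowly moving auxiliary point through a proximal term, so the effective objective in x is better conditioned. Below is a minimal, hypothetical sketch of how that technique could look in a federated loop with local updates and server averaging. It is not the paper's FESS-GDA; the quadratic toy losses and every name and hyperparameter (make_client, fess_gda_sketch, p, beta, local_steps) are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy problem: min_x max_y (1/n) sum_i f_i(x, y), with quadratic
# local losses f_i(x, y) = 0.5 x^T A_i x + x^T B_i y - 0.5 y^T C_i y.

def make_client(seed, d):
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(d, d)); A = A @ A.T / d   # x-curvature
    B = rng.normal(size=(d, d)) / d                # coupling term
    C = np.eye(d)                                  # strongly concave in y
    return A, B, C

def grad_x(client, x, y):
    A, B, _ = client
    return A @ x + B @ y

def grad_y(client, x, y):
    _, B, C = client
    return B.T @ x - C @ y

def fess_gda_sketch(clients, d, rounds=100, local_steps=5,
                    lr_x=0.05, lr_y=0.05, p=1.0, beta=0.5):
    """Illustrative federated smoothed GDA loop (not the paper's exact method)."""
    x, y, z = np.zeros(d), np.zeros(d), np.zeros(d)
    for _ in range(rounds):
        xs, ys = [], []
        for c in clients:
            xi, yi = x.copy(), y.copy()
            for _ in range(local_steps):
                # Smoothing: descend on f_i(x, y) + (p/2)||x - z||^2 in x.
                xi = xi - lr_x * (grad_x(c, xi, yi) + p * (xi - z))
                # Plain gradient ascent in y.
                yi = yi + lr_y * grad_y(c, xi, yi)
            xs.append(xi); ys.append(yi)
        # Server averages client iterates, then slowly drags the
        # auxiliary point z toward the averaged primal iterate.
        x, y = np.mean(xs, axis=0), np.mean(ys, axis=0)
        z = z + beta * (x - z)
    return x, y

d = 10
clients = [make_client(s, d) for s in range(4)]
x_star, y_star = fess_gda_sketch(clients, d)
```

The smoothing shows up in two places: the extra p * (xi - z) proximal gradient in the x-update and the slow server-side update of z. Setting p = 0 recovers plain federated stochastic gradient descent ascent, which makes the role of the auxiliary variable easy to isolate in experiments.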
Subjects: cs.LG, cs.IT, math.IT, math.OC, stat.ML