Nov. 5, 2023, 6:42 a.m. | Wei Shen, Minhui Huang, Jiawei Zhang, Cong Shen

cs.LG updates on arXiv.org

In recent years, federated minimax optimization has attracted growing
interest due to its extensive applications in various machine learning tasks.
While Smoothed Alternating Gradient Descent Ascent (Smoothed-AGDA) has
proved successful in centralized nonconvex minimax optimization, whether and
how the smoothing technique can help in the federated setting remains
unexplored. In this paper, we propose a new algorithm, termed Federated
Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), which utilizes the
smoothing technique for federated minimax optimization. We prove that
FESS-GDA can be …

