Web: http://arxiv.org/abs/2205.02215

May 5, 2022, 1:12 a.m. | Davoud Ataee Tarzanagh, Mingchen Li, Christos Thrampoulidis, Samet Oymak

cs.LG updates on arXiv.org

Standard federated optimization methods successfully apply to stochastic
problems with single-level structure. However, many contemporary ML
problems -- including adversarial robustness, hyperparameter tuning, and
actor-critic -- fall under nested bilevel programming that subsumes minimax and
compositional optimization. In this work, we propose FedNest: A federated
alternating stochastic gradient method to address general nested problems. We
establish provable convergence rates for FedNest in the presence of
heterogeneous data and introduce variations for bilevel, minimax, and
compositional optimization. FedNest introduces multiple innovations …
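To make the nested structure concrete, here is a minimal toy sketch of alternating stochastic gradient steps on a bilevel problem with heterogeneous client data. This is an illustration of the general inner/outer alternation and FedAvg-style gradient averaging, not the actual FedNest algorithm; the problem, client targets, and step sizes are all invented for the example.

```python
import numpy as np

# Toy bilevel problem (illustrative only):
#   inner: y*(x) = argmin_y g(x, y) = (y - x)^2   =>  y*(x) = x
#   outer: min_x  f(x, y*(x)) = mean_i (y*(x) - c_i)^2
# Each "client" holds one target c_i (heterogeneous data); clients compute
# local outer (hyper)gradients that the server averages, FedAvg-style.

client_targets = np.array([1.0, 3.0, 5.0])   # heterogeneous client data
# The bilevel optimum is x = y = mean(client_targets) = 3.0.

x, y = 0.0, 0.0
lr_inner, lr_outer = 0.5, 0.1

for _ in range(200):
    # --- inner phase: SGD steps on the lower-level variable y ---
    for _ in range(5):
        y -= lr_inner * 2.0 * (y - x)        # grad_y g(x, y) = 2(y - x)
    # --- outer phase: local hypergradients, averaged by the server ---
    # With g = (y - x)^2 we have dy*/dx = 1, so client i's hypergradient
    # is d/dx (y - c_i)^2 = 2 * (y - c_i) * 1.
    local_grads = 2.0 * (y - client_targets)
    x -= lr_outer * local_grads.mean()       # FedAvg of outer gradients

print(round(x, 3), round(y, 3))              # both converge to 3.0
```

The alternation (several inner steps per outer step) mirrors the structure of alternating bilevel methods; the averaging step stands in for the federated communication round.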
