Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities
April 2, 2024, 7:44 p.m. | Aleksandr Beznosikov, Martin Takáč, Alexander Gasnikov
cs.LG updates on arXiv.org
Abstract: Variational inequalities are a broad and flexible class of problems that includes minimization, saddle point, and fixed point problems as special cases. Therefore, variational inequalities are used in various applications ranging from equilibrium search to adversarial learning. With the increasing size of data and models, today's instances demand parallel and distributed computing for real-world machine learning problems, most of which can be represented as variational inequalities. Meanwhile, most distributed approaches have a significant bottleneck - …
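For context, the standard variational inequality formulation (the textbook definition, not anything specific to this paper's methods) and the two special cases the abstract mentions can be written as follows:

```latex
% Variational inequality (VI): given an operator F and a feasible set
% \mathcal{X}, find x^* \in \mathcal{X} such that
\langle F(x^*),\, x - x^* \rangle \ge 0 \quad \forall\, x \in \mathcal{X}.

% Minimization as a special case: taking F = \nabla f recovers the
% first-order optimality condition of
\min_{x \in \mathcal{X}} f(x).

% Saddle-point (e.g. adversarial learning) as a special case: for
% \min_{x} \max_{y} f(x, y), take
F(x, y) = \big( \nabla_x f(x, y),\; -\nabla_y f(x, y) \big).
```

The fixed-point case is analogous: finding $x^*$ with $T(x^*) = x^*$ corresponds to the VI with $F = \mathrm{Id} - T$.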
Subjects: cs.LG, cs.DC, cs.GT, math.OC, stat.ML
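The title's "compression" pillar refers to reducing the size of messages exchanged between distributed workers. As a minimal illustration only (this is the standard unbiased rand-k sparsification operator, not necessarily the compressor the paper analyzes), a worker can send k random coordinates of its update, rescaled so the compressed vector is unbiased:

```python
import numpy as np

def rand_k(x: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
    """Unbiased rand-k sparsification.

    Keeps k uniformly chosen coordinates of x and scales them by d/k,
    so that E[rand_k(x)] = x while only k values need to be communicated.
    """
    d = x.size
    idx = rng.choice(d, size=k, replace=False)  # coordinates to transmit
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)                 # rescale for unbiasedness
    return out

# Example: compress a 5-dimensional update down to 2 coordinates.
rng = np.random.default_rng(0)
compressed = rand_k(np.arange(1.0, 6.0), k=2, rng=rng)
```

Unbiasedness is what lets such compressors be plugged into convergence analyses: the extra variance they introduce is traded against the k/d reduction in communicated entries.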