June 27, 2024, 4:49 a.m. | Zishan Shao, Aditya Devarakonda

stat.ML updates on arXiv.org

arXiv:2406.18001v1 Announce Type: cross
Abstract: Dual Coordinate Descent (DCD) and Block Dual Coordinate Descent (BDCD) are important iterative methods for solving convex optimization problems. In this work, we develop scalable DCD and BDCD methods for the kernel support vector machines (K-SVM) and kernel ridge regression (K-RR) problems. On distributed-memory parallel machines the scalability of these methods is limited by the need to communicate every iteration. On modern hardware where communication is orders of magnitude more expensive, the running time of …
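To make the setting concrete, here is a minimal sketch of dual coordinate descent for kernel ridge regression, one of the two problems the abstract names. The dual K-RR objective is 0.5 a^T (K + lam I) a - y^T a, whose minimizer solves (K + lam I) a = y; each coordinate is updated exactly in turn. This is an illustrative single-machine implementation, not the paper's scalable distributed-memory algorithm, and the function name and parameters are hypothetical.

```python
import numpy as np

def kernel_ridge_dcd(K, y, lam, n_sweeps=500):
    """Hypothetical sketch: dual coordinate descent for kernel ridge
    regression. Minimizes 0.5 a^T (K + lam I) a - y^T a by exact
    per-coordinate updates (Gauss-Seidel on (K + lam I) a = y)."""
    n = K.shape[0]
    a = np.zeros(n)
    Ka = np.zeros(n)  # running product K @ a, updated incrementally
    for _ in range(n_sweeps):
        for i in range(n):
            # residual of equation i under the current iterate
            r = y[i] - (Ka[i] + lam * a[i])
            delta = r / (K[i, i] + lam)  # exact minimizer along coordinate i
            a[i] += delta
            Ka += delta * K[:, i]  # keep K @ a consistent with new a
    return a
```

Note the incremental update of `Ka`: each coordinate step costs O(n) rather than O(n^2), which is the standard trick that makes DCD cheap per iteration. In a distributed-memory setting (the paper's focus), the column `K[:, i]` and the update of `Ka` are what force communication every iteration.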

