July 5, 2022, 1:10 a.m. | Tobias Glasmachers

cs.LG updates on arXiv.org

Support vector machines (SVMs) are a standard method in the machine learning
toolbox, in particular for tabular data. Non-linear kernel SVMs often deliver
highly accurate predictors; however, this comes at the cost of long training
times. The problem is aggravated by the exponential growth of data volumes over
time. It was tackled in the past mainly by two types of techniques: approximate
solvers, and parallel GPU implementations. In this work, we combine both
approaches to design an extremely fast dual SVM solver. …
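For context, the dual problem that such solvers work on can be optimized with coordinate ascent. The sketch below is not the paper's solver; it is a minimal, assumed illustration of plain dual coordinate ascent for a kernel SVM (hinge loss, box constraint, no bias term), using a dense RBF kernel matrix. The quadratic cost of that matrix is exactly the bottleneck that approximate solvers and GPU parallelism address. All parameter values and the toy data are assumptions.

```python
# Minimal dual coordinate ascent for a kernel SVM (illustrative sketch only,
# not the solver proposed in the paper).
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def train_dual_svm(X, y, C=1.0, gamma=0.5, epochs=50):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)        # n x n kernel matrix: the scaling bottleneck
    Q = K * np.outer(y, y)          # Q_ij = y_i * y_j * K(x_i, x_j)
    alpha = np.zeros(n)
    grad = -np.ones(n)              # gradient of 0.5*a'Qa - 1'a at alpha = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            # projected Newton step on coordinate i, clipped to the box [0, C]
            new_ai = np.clip(alpha[i] - grad[i] / Q[i, i], 0.0, C)
            delta = new_ai - alpha[i]
            if delta != 0.0:
                grad += delta * Q[i]   # keep the gradient cache consistent
                alpha[i] = new_ai
    return alpha

# Toy usage (assumed data): two Gaussian blobs with labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
alpha = train_dual_svm(X, y)
print("support vectors:", int(np.sum(alpha > 1e-6)))
```

Approximate solvers reduce the cost of the kernel computations above, while GPU implementations parallelize them; the paper combines both ideas.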

arxiv lg recipe scale svm training
