Feb. 13, 2024, 5:45 a.m. | Elies Gil-Fuster, Jens Eisert, Carlos Bravo-Prieto

cs.LG updates on arXiv.org

Quantum machine learning models have shown successful generalization performance even when trained with few data. In this work, through systematic randomization experiments, we show that traditional approaches to understanding generalization fail to explain the behavior of such quantum models. Our experiments reveal that state-of-the-art quantum neural networks accurately fit random states and random labeling of training data. This ability to memorize random data defies current notions of small generalization error, problematizing approaches that build on complexity measures such as the …
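The randomization experiment described above can be illustrated with a short sketch. This is a minimal, hypothetical example assuming the PennyLane library and its `default.qubit` simulator, with a `StronglyEntanglingLayers` ansatz standing in for the quantum neural networks studied in the paper (the specific architectures, data, and training setup of the original experiments are not reproduced here): a parameterized circuit is trained on inputs paired with fully random labels, and the training loss is monitored to see whether the model can memorize them.

```python
# Minimal sketch of a label-randomization experiment (assumptions: PennyLane,
# default.qubit simulator, StronglyEntanglingLayers as a stand-in ansatz).
import numpy as np
import pennylane as qml
from pennylane import numpy as pnp  # autograd-aware numpy for trainable weights

n_qubits, n_layers, n_samples = 4, 6, 20
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn(x, weights):
    # Encode a classical input, then apply a trainable variational circuit.
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(n_samples, n_qubits))  # arbitrary inputs
y = rng.choice([-1.0, 1.0], size=n_samples)            # fully random labels

def cost(weights):
    # Mean squared error between circuit outputs and the random labels.
    loss = 0.0
    for x, target in zip(X, y):
        loss = loss + (qnn(x, weights) - target) ** 2
    return loss / n_samples

# StronglyEntanglingLayers expects weights of shape (n_layers, n_qubits, 3).
init = rng.uniform(0, 2 * np.pi, size=(n_layers, n_qubits, 3))
weights = pnp.array(init, requires_grad=True)

opt = qml.AdamOptimizer(stepsize=0.1)
for step in range(200):
    weights, loss = opt.step_and_cost(cost, weights)

# A sufficiently expressive circuit drives the training loss toward zero even
# though the labels carry no information -- the memorization effect the
# abstract describes, which uniform complexity-based bounds fail to capture.
print(f"final training loss on random labels: {loss:.4f}")
```

In such a test, any gap between performance on the true labels and on the randomized labels must come from the data rather than from the model class alone, which is why fitting random labels undermines uniform complexity-based explanations of generalization.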
