Post-variational quantum neural networks
April 8, 2024, 4:43 a.m. | Po-Wei Huang, Patrick Rebentrost
cs.LG updates on arXiv.org arxiv.org
Abstract: Hybrid quantum-classical computing with variational algorithms in the noisy intermediate-scale quantum (NISQ) era can suffer from barren plateaus, which hinder the convergence of gradient-based optimization techniques. In this paper, we discuss "post-variational strategies", which shift the tunable parameters from the quantum computer to the classical computer, opting for ensemble strategies when optimizing quantum models. We discuss various strategies and design principles for constructing the individual quantum circuits, such that the resulting ensembles can be optimized with convex programming. Further, …
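The core idea described in the abstract, replacing tunable quantum parameters with a classical, convexly optimized combination of fixed circuits, can be sketched classically. In this minimal sketch the fixed quantum circuits are stood in by random cosine feature maps (an assumption purely for illustration; the paper's actual circuits and optimization details are not given here), and the classical convex step is ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_features, n_circuits = 200, 4, 8
X = rng.normal(size=(n_samples, n_features))
y = np.sin(X @ rng.normal(size=n_features))  # toy regression target

# Each "circuit" k maps an input x to a fixed, non-tunable scalar output.
# Stand-in here: random-frequency cosine features (NOT the paper's circuits).
W = rng.normal(size=(n_features, n_circuits))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_circuits)
Phi = np.cos(X @ W + b)  # (n_samples, n_circuits) ensemble of circuit outputs

# All tunable parameters live on the classical side: a linear combination
# of the fixed circuit outputs, fitted by least squares (a convex problem).
Phi = np.hstack([Phi, np.ones((n_samples, 1))])  # include an intercept term
alpha, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = Phi @ alpha
mse = np.mean((pred - y) ** 2)
```

Because the quantum circuits carry no trainable parameters, no gradients flow through them, which is how the post-variational framing sidesteps barren plateaus; only the convex classical fit is optimized.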