Federated Quantum Natural Gradient Descent for Quantum Federated Learning. (arXiv:2209.00564v1 [quant-ph])
Sept. 2, 2022, 1:12 a.m. | Jun Qi
cs.LG updates on arXiv.org arxiv.org
Quantum Federated Learning (QFL) centers on a distributed learning architecture spanning several local quantum devices, and a more efficient training algorithm for QFL is expected to minimize the communication overhead among the quantum participants. In this work, we put forth an efficient learning algorithm, namely federated quantum natural gradient descent (FQNGD), applied in a QFL framework consisting of variational quantum circuit (VQC)-based quantum neural networks (QNNs). The FQNGD algorithm admits much fewer training …
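The abstract truncates before the algorithm's details, but the core idea of federated natural gradient descent can be sketched classically: each participant preconditions its local gradient by an inverse metric (for FQNGD, the quantum Fisher information of the VQC; here a classical Fisher-like matrix stands in), and the server aggregates the resulting updates. All function names, weighting, and structure below are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch of a federated natural-gradient round. The true FQNGD
# uses the quantum Fisher information of VQC parameters; we substitute
# a generic positive-definite metric for illustration.
import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.1, eps=1e-6):
    """One natural-gradient update: precondition grad by the inverse metric.

    eps regularizes the metric so the linear solve is well conditioned.
    """
    dim = len(theta)
    return theta - lr * np.linalg.solve(fisher + eps * np.eye(dim), grad)

def federated_round(theta, client_grads, client_fishers, client_sizes, lr=0.1):
    """Aggregate per-client natural-gradient updates, weighted by data size.

    Each client starts from the shared parameters `theta`; the server
    returns the size-weighted average of their locally updated parameters.
    """
    total = sum(client_sizes)
    updates = [
        natural_gradient_step(theta, g, f, lr)
        for g, f in zip(client_grads, client_fishers)
    ]
    return sum((n / total) * u for n, u in zip(client_sizes, updates))
```

On a quadratic loss the preconditioning makes each step jump a fixed fraction of the way to the optimum regardless of the loss curvature, which is the usual motivation for natural-gradient methods over plain gradient descent.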