A quantum algorithm to train neural networks using low-depth circuits

14 Dec 2017  ·  Guillaume Verdon, Michael Broughton, Jacob Biamonte

It remains an open question whether near-term gate-model quantum computers will offer a quantum advantage for practical applications in the pre-fault-tolerance noise regime. One class of algorithms that has shown promise in this regard is the so-called classical-quantum hybrid variational algorithms. Here we develop a low-depth quantum algorithm to train quantum Boltzmann machine neural networks using such variational methods. We introduce a method which employs the quantum approximate optimization algorithm (QAOA) as a subroutine to approximately sample from Gibbs states of Ising Hamiltonians. We use this approximate Gibbs sampling to train neural networks, and demonstrate training convergence for numerically simulated noisy circuits with depolarizing error rates of up to 4%.
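As background for the QAOA-based Gibbs sampling described above, the following is a minimal numpy sketch of a generic QAOA ansatz on a small classical Ising Hamiltonian, compared against the exact Gibbs distribution it would be tuned to approximate. It is an illustration of the general technique only, not the authors' specific construction (which involves additional structure beyond a bare QAOA circuit); the system size, coupling, angles, and inverse temperature are all illustrative.

```python
import numpy as np

n = 3                       # qubits on a ring (illustrative size)
dim = 2 ** n
beta_temp = 1.0             # inverse temperature of the target Gibbs state

def ising_energies(n, J=1.0):
    """Diagonal of H = J * sum_i Z_i Z_{i+1} on a ring, in the computational basis."""
    e = np.zeros(2 ** n)
    for idx in range(2 ** n):
        z = [1 - 2 * ((idx >> k) & 1) for k in range(n)]   # Z eigenvalues per qubit
        e[idx] = J * sum(z[i] * z[(i + 1) % n] for i in range(n))
    return e

energies = ising_energies(n)

# Target: exact Gibbs distribution p(x) proportional to exp(-beta * E(x)).
gibbs = np.exp(-beta_temp * energies)
gibbs /= gibbs.sum()

def apply_mixer(state, b):
    """Apply e^{-i b X} to every qubit of an n-qubit statevector."""
    c, s = np.cos(b), -1j * np.sin(b)
    for q in range(n):
        state = state.reshape(2 ** (n - q - 1), 2, 2 ** q)
        a0, a1 = state[:, 0, :].copy(), state[:, 1, :].copy()
        state[:, 0, :] = c * a0 + s * a1
        state[:, 1, :] = s * a0 + c * a1
        state = state.reshape(-1)
    return state

def qaoa_distribution(gammas, betas):
    """Measurement distribution of the QAOA state for given angle schedules."""
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)   # uniform superposition
    for g, b in zip(gammas, betas):
        state = np.exp(-1j * g * energies) * state          # cost layer (diagonal)
        state = apply_mixer(state, b)                       # mixer layer
    return np.abs(state) ** 2

# One illustrative layer; in practice the angles are tuned variationally.
probs = qaoa_distribution([0.4], [0.3])
print("total variation distance to Gibbs:", 0.5 * np.abs(probs - gibbs).sum())
```

In a variational setting, the printed distance (or a related cost) would be driven down by a classical optimizer adjusting the angle schedules.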

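The noise model quoted in the abstract is depolarizing error. As a small self-contained illustration (assuming nothing about the paper's simulations beyond the channel's standard definition), a single-qubit depolarizing channel of rate p acts as rho -> (1 - p) * rho + p * I / 2; the 4% rate below matches the figure quoted above, while the input state is arbitrary.

```python
import numpy as np

def depolarize(rho, p):
    """Apply a single-qubit depolarizing channel of rate p to density matrix rho."""
    return (1 - p) * rho + p * np.eye(2) / 2

p = 0.04                                               # rate from the abstract
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)    # |+> state (illustrative)
rho = np.outer(plus, plus.conj())

noisy = depolarize(rho, p)
# Purity tr(rho^2) drops below 1, quantifying the mixing the channel injects.
print("purity before:", np.real(np.trace(rho @ rho)))
print("purity after: ", np.real(np.trace(noisy @ noisy)))
```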