December 8, 2022

Improved Boltzmann machines with error corrected quantum annealing

Boltzmann machines are the basis of several deep learning methods that have been successfully applied to both supervised and unsupervised machine learning tasks. These models assume that the data are generated according to a Boltzmann distribution, and the goal of training is to learn the parameters for which the model distribution most closely matches the distribution of the input data.
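As a minimal sketch of what such a model represents (illustrative only, not taken from the paper; the couplings and fields below are hypothetical), the Boltzmann distribution over a small fully visible network of spins can be computed exactly by brute-force enumeration:

```python
# Illustrative sketch: exact Boltzmann probabilities for a tiny fully
# visible Boltzmann machine over spins s in {-1,+1}^n. The couplings J
# and fields h are hypothetical example values, not from the paper.
import itertools
import numpy as np

def boltzmann_probs(J, h, beta=1.0):
    """Exact Boltzmann distribution with E(s) = -s^T J s / 2 - h^T s,
    p(s) proportional to exp(-beta * E(s)).

    Enumerating all 2^n states is only feasible for small n, which is
    why sampling (classical or quantum) is needed at realistic scales.
    """
    n = len(h)
    states = np.array(list(itertools.product([-1, 1], repeat=n)))
    energies = -0.5 * np.einsum('si,ij,sj->s', states, J, states) - states @ h
    weights = np.exp(-beta * energies)
    return states, weights / weights.sum()

# Tiny 3-spin chain with ferromagnetic couplings and no fields.
J = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
h = np.zeros(3)
states, p = boltzmann_probs(J, h)
# The two fully aligned configurations have the lowest energy and
# therefore carry the largest probability.
```

Training would adjust `J` and `h` so that this distribution matches the empirical distribution of the data; the hard part in practice is estimating the model's expectations, which is where sampling comes in.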

Training such models is difficult because traditional sampling techniques are intractable at scale, and proposals to use quantum annealers for sampling aim to mitigate this cost. However, a real physical device inevitably couples to its environment, and the strength of this coupling affects the effective temperature of the distribution from which the quantum annealer samples.
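The effect of the effective temperature can be sketched directly (an illustrative toy, not from the paper; the energy levels are hypothetical): a sampler at temperature T draws from p_T(s) proportional to exp(-E(s)/T), so a higher T flattens the distribution and washes out the structure the training procedure relies on.

```python
# Illustrative sketch: how effective temperature reshapes a Boltzmann
# distribution. The three energy levels are hypothetical example values.
import numpy as np

def probs_at_temperature(energies, T):
    """Boltzmann probabilities p(s) ∝ exp(-E(s)/T) at temperature T."""
    w = np.exp(-np.asarray(energies) / T)
    return w / w.sum()

energies = np.array([0.0, 1.0, 2.0])  # hypothetical energy spectrum

p_cold = probs_at_temperature(energies, T=0.5)
p_hot = probs_at_temperature(energies, T=5.0)

# At low T, probability concentrates on the ground state; at high T, the
# distribution approaches uniform, so samples carry less information
# about the energy landscape the model is supposed to learn.
```

This is why a device whose effective temperature is too high yields poor training samples, motivating the error-corrected approach described next.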

To counteract this problem, error correction schemes that can effectively reduce the temperature are needed if quantum annealing is to offer a benefit at larger problem scales, where the effective temperature of the device may no longer be sufficiently low.

To this end, researchers at the University of Southern California have applied nested quantum annealing correction (NQAC) to unsupervised learning on a small bars-and-stripes dataset and to supervised learning on a coarse-grained MNIST dataset. For both datasets they demonstrate improved training, and a concomitant effective-temperature reduction, at higher noise levels relative to the unencoded case. They also find better overall performance with longer anneal times and interpret the results by comparison with simulated quantum annealing and spin-vector Monte Carlo. Counterintuitively, the output distribution generally becomes less Gibbs-like as the nesting level and anneal time increase, showing that improved training performance can be achieved without equilibration to the target Gibbs distribution.

Read more.