Unsupervised Machine Learning on a Hybrid Quantum Computer

J. S. Otterbach, R. Manenti, N. Alidoust, A. Bestwick, M. Block, B. Bloom, S. Caldwell, N. Didier, E. Schuyler Fried, S. Hong, P. Karalekas, C. B. Osborn, A. Papageorge, E. C. Peterson, G. Prawiroatmodjo, et al.

Machine learning techniques have led to broad adoption of a statistical model
of computing. The statistical distributions natively available on quantum
processors are a superset of those available classically. Harnessing this
attribute has the potential to accelerate or otherwise improve machine learning
relative to purely classical performance. A key challenge toward that goal is
learning to hybridize classical computing resources and traditional learning
techniques with the emerging capabilities of general-purpose quantum
processors. Here, we demonstrate such hybridization by training a 19-qubit
gate-model processor to solve a clustering problem, a foundational challenge in
unsupervised learning. We use the quantum approximate optimization algorithm in
conjunction with gradient-free Bayesian optimization to train the quantum
machine. This quantum-classical hybrid algorithm shows robustness to realistic
noise, and we find evidence that classical optimization can be used to train
around both coherent and incoherent imperfections.
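To make the hybrid loop concrete, the sketch below classically simulates a depth-one quantum approximate optimization algorithm (QAOA) circuit for a MaxCut instance, the standard encoding of two-way clustering (edge weights playing the role of pairwise distances), and searches the two circuit angles with a crude grid scan standing in for the paper's Bayesian optimizer. The toy graph, qubit count, and function names are illustrative assumptions, not the authors' 19-qubit experiment.

```python
import numpy as np

# Hypothetical 4-node weighted graph: clustering 4 points into 2 groups,
# encoded as MaxCut (edge weight ~ pairwise distance). Illustrative only.
n = 4
edges = {(0, 1): 1.0, (0, 2): 1.0, (1, 3): 1.0, (2, 3): 1.0, (0, 3): 0.1}
dim = 2 ** n

# Diagonal of the cost Hamiltonian: C(z) = sum_{(i,j)} w_ij * [z_i != z_j]
costs = np.zeros(dim)
for z in range(dim):
    bits = [(z >> k) & 1 for k in range(n)]
    costs[z] = sum(w for (i, j), w in edges.items() if bits[i] != bits[j])

def apply_mixer(state, beta):
    """Apply the mixer layer e^{-i beta X} to every qubit."""
    c, s = np.cos(beta), -1j * np.sin(beta)
    psi = state.reshape([2] * n)
    for q in range(n):
        psi = np.moveaxis(psi, q, 0)
        a, b = psi[0].copy(), psi[1].copy()
        psi[0] = c * a + s * b
        psi[1] = s * a + c * b
        psi = np.moveaxis(psi, 0, q)
    return psi.reshape(dim)

def qaoa_expectation(gamma, beta):
    """Expected cut value <C> after one QAOA layer (p = 1)."""
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # |+...+> start
    state = np.exp(-1j * gamma * costs) * state            # cost layer
    state = apply_mixer(state, beta)                       # mixer layer
    return float(np.real(np.vdot(state, costs * state)))

# Grid search over the angles; the paper instead uses a gradient-free
# Bayesian optimizer, which is far more sample-efficient on hardware.
best = max((qaoa_expectation(g, b), g, b)
           for g in np.linspace(0, np.pi, 25)
           for b in np.linspace(0, np.pi, 25))
print(f"best <C> = {best[0]:.3f} at gamma={best[1]:.2f}, beta={best[2]:.2f}")
```

The classical outer loop only ever sees measured cost values, which is why a gradient-free optimizer can train around both coherent and incoherent hardware imperfections: any systematic error is absorbed into the landscape being optimized.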