Physics & Astronomy ETDs
Publication Date
Fall 12-1-2018
Abstract
In recent years, quantum information processors (QIPs) have grown from one or two qubits to tens of qubits. As a result, characterizing QIPs – measuring how well they work, and how they fail – has become much more challenging. The obstacles to characterizing today’s QIPs will become even more formidable as QIPs grow from tens of qubits to hundreds, and enter what has been called the “noisy, intermediate-scale quantum” (NISQ) era. This thesis develops methods based on advanced statistics and machine learning algorithms to address the difficulties of “quantum characterization, validation, and verification” (QCVV) of NISQ processors. In the first part of this thesis, I use statistical model selection to develop techniques for choosing between several models for a QIP’s behavior. In the second part, I deploy machine learning algorithms to develop a new QCVV technique and to do experiment design. These investigations help lay a foundation for extending QCVV to characterize the next generation of NISQ processors.
Degree Name
Physics
Level of Degree
Doctoral
Department Name
Physics & Astronomy
First Committee Member (Chair)
Robin Blume-Kohout
Second Committee Member
Carlton M. Caves
Third Committee Member
Francisco Elohim Becerra
Fourth Committee Member
Gabriel Huerta
Language
English
Keywords
quantum tomography, qcvv, machine learning, NISQ, QIP
Document Type
Dissertation
Recommended Citation
Scholten, Travis Luke. "Towards Scalable Characterization of Noisy, Intermediate-Scale Quantum Information Processors." (2018). https://digitalrepository.unm.edu/phyc_etds/205