With the rapid advancement of quantum technologies in recent years, the application of quantum computing to machine learning tasks has attracted growing interest, particularly in the context of the Noisy Intermediate-Scale Quantum (NISQ) era. While quantum machine learning (QML) offers the promise of surpassing classical methods in expressivity and computational power, its practical implementation is hindered by current hardware limitations—most notably, quantum shot noise and limited coherence time. Among the many QML paradigms, Quantum Reservoir Computing (QRC) has emerged as a promising approach. Inspired by classical recurrent neural networks and quantum kernel methods, QRC provides a resource-efficient architecture especially suited to NISQ devices, as it avoids deep quantum circuits and extensive parameter training.
In this thesis, I introduce a theoretical framework—Resolvable Expressive Capacity and Eigentask Analysis—that offers a rigorous foundation for quantifying and mitigating the effects of quantum shot noise in QRC systems. By characterizing the resolvable portion of the model’s expressive capacity under shot noise, this framework not only deepens our understanding of noise in QML but also provides a practical method for tailoring QRC to optimize generalization performance. This analysis further uncovers deep connections between QML, quantum metrology, and quantum dynamics, suggesting new directions for interdisciplinary research. Additionally, I present a hardware-compatible implementation framework, NISQ Reservoir Computing (NISQRC), which leverages partial measurement and deterministic reset to realize QRC with sustained temporal memory that outlasts the native coherence time of the hardware. Both frameworks are demonstrated on state-of-the-art superconducting quantum platforms, validating their utility in learning and inference.
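For readers curious what an eigentask-style analysis can look like in practice, the short Python sketch below is a toy illustration only. It assumes, as a simplification rather than the thesis's exact construction, that eigentasks arise from a generalized eigenvalue problem between the Gram matrix of the expected measured features and a shot-noise covariance, and that each eigentask contributes 1/(1 + beta^2/S) to the resolvable capacity at S shots per input; all features, normalizations, and names here are placeholders.

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

K = 8           # number of measured features (placeholder)
N_inputs = 200  # number of sampled inputs u (placeholder)
S = 100         # shots per input (placeholder)

# Toy stand-in for the expected reservoir features x_bar_k(u); in a real QRC
# experiment these would be estimated measurement probabilities.
x_bar = rng.uniform(0.0, 1.0, size=(N_inputs, K))

# Second-moment (Gram) matrix of the expected features over the input ensemble.
G = x_bar.T @ x_bar / N_inputs

# Illustrative shot-noise covariance: binomial-style variance for probability-valued features.
V = np.diag(np.mean(x_bar * (1.0 - x_bar), axis=0))

# Generalized eigenproblem V r = beta^2 G r: eigenvectors define eigentask
# combinations of features, eigenvalues beta^2 their noise-to-signal ratios.
beta2, R = eigh(V, G)

# Toy resolvable expressive capacity: low-noise eigentasks count fully,
# noisy ones are suppressed by the finite shot budget S.
rec = float(np.sum(1.0 / (1.0 + beta2 / S)))
print(f"toy resolvable expressive capacity: {rec:.2f} of {K}")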
Finally, I explore a novel approach to quantum optimization that integrates the structural strengths of QRC with the stochastic nature of shot noise. Rather than treating noise solely as an obstacle, this perspective embraces shot noise as a potential resource—particularly useful in stochastic optimization. I propose a method that employs a surrogate loss landscape derived from the reservoir to perform efficient parameter training. This work opens the door to noise-adaptive quantum optimization techniques, offering a practical path forward for high-performance QML in the NISQ era.
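As a rough illustration of the surrogate-training idea, with shot noise entering only as sampling noise on loss evaluations, here is a hedged Python sketch. It substitutes a simple quadratic feature basis for the reservoir-derived surrogate purely for readability; the noisy-loss model, feature map, and fitting choices are illustrative assumptions, not the construction proposed in the thesis.

import numpy as np

rng = np.random.default_rng(1)

def noisy_loss(theta, shots=100):
    # Toy stand-in for a shot-noise-limited loss estimate from hardware.
    true_value = np.sum(np.sin(theta) ** 2)
    return true_value + rng.normal(0.0, 1.0 / np.sqrt(shots))

dim, n_samples = 4, 300
thetas = rng.uniform(-np.pi, np.pi, size=(n_samples, dim))
losses = np.array([noisy_loss(t) for t in thetas])

def phi(t):
    # Quadratic basis standing in for reservoir features: [1, theta, theta_i * theta_j].
    return np.concatenate(([1.0], t, np.outer(t, t)[np.triu_indices(len(t))]))

Phi = np.array([phi(t) for t in thetas])
w, *_ = np.linalg.lstsq(Phi, losses, rcond=None)  # least-squares fit; shot noise averages out

# Minimize the cheap surrogate landscape instead of the expensive noisy loss.
candidates = rng.uniform(-np.pi, np.pi, size=(5000, dim))
surrogate = np.array([phi(c) for c in candidates]) @ w
theta_star = candidates[np.argmin(surrogate)]
print("surrogate minimizer:", np.round(theta_star, 2),
      "| true loss there:", round(float(np.sum(np.sin(theta_star) ** 2)), 3))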
Adviser: Hakan Türeci
Zoom Mtg: https://princeton.zoom.us/j/93027521422