Generalization Error in Quantum Machine Learning in the Presence of Sampling Noise
ORAL
Abstract
Tackling sampling noise is an unavoidable challenge when extracting information in machine learning with physical systems. Eigentask Learning was developed in recent work as a framework for learning in the presence of sampling noise [1]. That work presented numerical evidence that extracting low-noise eigentasks can improve performance on machine learning tasks, conferring robustness to overfitting and increasing generalization accuracy. However, characterizing the generalization error when the training dataset is finite has remained an open problem. In this study, we employ methodologies from statistical mechanics to calculate the learning curve of a generic quantum machine learning system when the training set contains a large but finite number of samples. Our analytical findings, supported by numerical validation, provide solid justification that Eigentask Learning is optimal, in the sense of minimizing generalization error.
* This research was developed with funding from the AFOSR award FA9550-20-1-0177, AFOSR MURI award FA9550-22-1-0203 and DARPA contract HR00112190072. The views, opinions, and findings expressed are solely the authors' and not the U.S. government's.
–
Publication: [1] F. Hu, et al., Phys. Rev. X 13, 041020 (2023)
Presenters
-
Fangjun Hu
Princeton University
Authors
-
Fangjun Hu
Princeton University
-
Xun Gao
University of Colorado Boulder
-
Hakan E. Tureci
Princeton University