Learning complexity gradually in quantum machine learning models
- Publication Year :
- 2024
Abstract
- Quantum machine learning is an emerging field that continues to draw significant interest for its potential to offer improvements over classical algorithms in certain areas. However, training quantum models remains a challenging task, largely because of the difficulty of establishing an effective inductive bias when solving high-dimensional problems. In this work, we propose a training framework that prioritizes informative data points over the entire training set. This approach draws inspiration from classical techniques such as curriculum learning and hard example mining, introducing an additional inductive bias through the training data itself. By selectively focusing on informative samples, we aim to steer the optimization process toward more favorable regions of the parameter space. This data-centric approach complements existing strategies such as warm-start initialization methods, providing an additional pathway to address performance challenges in quantum machine learning. We provide theoretical insights into the benefits of prioritizing informative data for quantum models, and we validate our methodology with numerical experiments on selected tasks of recognizing quantum phases of matter. Our findings indicate that this strategy could be a valuable approach for improving the performance of quantum machine learning models.
- Comment: 10 + 2 pages, 6 figures, 3 tables
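- Illustrative sketch (not from the paper): the hard-example-mining idea described in the abstract can be outlined as selecting the highest-loss samples at each step and updating the model only on those. The sketch below uses a classical toy classifier as a hypothetical stand-in for the quantum model; all names, data shapes, and hyperparameters are assumptions for illustration only.

```python
import numpy as np

# Hypothetical stand-in for a parameterized (quantum) model:
# f(x; theta) = sigmoid(x @ theta). Synthetic data for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                      # toy features
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(float)
theta = np.zeros(8)

def predict(theta, X):
    return 1.0 / (1.0 + np.exp(-X @ theta))

def per_sample_loss(theta, X, y):
    # Binary cross-entropy per data point, used to rank "informativeness".
    p = np.clip(predict(theta, X), 1e-9, 1 - 1e-9)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

lr, k = 0.1, 32                                    # assumed step size and mined-batch size
for step in range(100):
    losses = per_sample_loss(theta, X, y)
    hard = np.argsort(losses)[-k:]                 # hard example mining: keep highest-loss points
    Xb, yb = X[hard], y[hard]
    grad = Xb.T @ (predict(theta, Xb) - yb) / k    # gradient of mean cross-entropy on mined batch
    theta -= lr * grad

print("final mean loss:", per_sample_loss(theta, X, y).mean())
```

- A curriculum-learning variant of the same loop would instead start from the lowest-loss (easiest) samples and gradually grow the selected subset; the ranking-and-selection step is the only part that changes.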
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2411.11954
- Document Type :
- Working Paper