
Statistical generalization performance guarantee for meta-learning with data dependent prior.

Authors :
Liu, Tianyu
Lu, Jie
Yan, Zheng
Zhang, Guangquan
Source :
Neurocomputing. Nov 2021, Vol. 465, p391-405. 15p.
Publication Year :
2021

Abstract

• To improve generalization performance, three novel PAC-Bayes meta-learning bounds are proposed.
• Based on the ERM method, a PAC-Bayes meta-learning bound with a data-dependent prior is developed.
• Its computational complexity is analyzed and experiments illustrate its effectiveness.

Meta-learning aims to leverage experience from previous tasks to achieve effective and fast adaptation when encountering new tasks. However, it is unclear how the generalization properties of meta-learned models carry over to new tasks. Probably approximately correct (PAC)-Bayes bound theory provides a framework for analyzing the generalization performance of meta-learning, with an explicit numerical upper bound on the generalization error; a tighter upper bound corresponds to a better generalization guarantee. In existing PAC-Bayes meta-learning bounds, however, the prior distribution is chosen arbitrarily, independently of the data, which results in loose bounds and poor generalization performance. In this paper, we derive three novel generalization error upper bounds for meta-learning based on the PAC-Bayes relative entropy bound. Furthermore, to avoid an arbitrarily chosen prior distribution, a data-dependent prior for the PAC-Bayes meta-learning bound is developed based on the empirical risk minimization (ERM) method, and its sample complexity and computational complexity are analyzed. Experiments illustrate that the three proposed PAC-Bayes meta-learning bounds achieve a competitive generalization guarantee, and that the extended PAC-Bayes bound with a data-dependent prior converges rapidly. [ABSTRACT FROM AUTHOR]
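For context (this record does not reproduce the paper's own bounds): the single-task PAC-Bayes relative entropy (kl) bound that such meta-learning bounds extend is commonly stated as follows. The notation here (posterior Q, prior P, sample size n, confidence parameter δ) is the standard one, not necessarily the paper's.

For any prior $P$ fixed before observing the sample, with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for all posteriors $Q$,
$$\mathrm{kl}\!\left(\hat{L}(Q)\,\middle\|\,L(Q)\right) \le \frac{\mathrm{KL}(Q\,\|\,P) + \ln\!\left(2\sqrt{n}/\delta\right)}{n},$$
where $\hat{L}(Q)$ is the empirical risk, $L(Q)$ is the true risk, and $\mathrm{kl}(q\,\|\,p) = q\ln\tfrac{q}{p} + (1-q)\ln\tfrac{1-q}{1-p}$ is the binary relative entropy.

A data-dependent prior, as the abstract describes, replaces the fixed $P$ with one learned from data not covered by the bound (for example, an ERM solution computed on a held-out portion of the tasks); this can shrink the $\mathrm{KL}(Q\,\|\,P)$ term, and hence tighten the bound, while keeping the guarantee valid.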

Details

Language :
English
ISSN :
0925-2312
Volume :
465
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
153322510
Full Text :
https://doi.org/10.1016/j.neucom.2021.09.018