Generalization Bounds For Meta-Learning: An Information-Theoretic Analysis

Authors :
Chen, Q.
Shui, C.
Marchand, Mario
Source :
Scopus-Elsevier
Publication Year :
2021

Abstract

We derive a novel information-theoretic analysis of the generalization properties of meta-learning algorithms. Concretely, our analysis provides a unified treatment of both the conventional learning-to-learn framework and modern model-agnostic meta-learning (MAML) algorithms. Moreover, we provide a data-dependent generalization bound for a stochastic variant of MAML that is non-vacuous for deep few-shot learning. Empirical validations on both simulated data and a well-known few-shot benchmark show that our bound is, in most situations, orders of magnitude tighter than previous bounds that depend on the squared norm of the gradients.
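To make the MAML setup referenced in the abstract concrete, the following is a minimal first-order sketch of the inner-adaptation / outer-update loop on toy linear-regression tasks. All names (`task`, `loss_and_grad`, the learning rates) and the first-order simplification are illustrative assumptions for this sketch, not the paper's algorithm or its stochastic variant.

```python
import numpy as np

rng = np.random.default_rng(0)

def task():
    # A toy regression task: y = X @ w_true + noise, with w_true drawn per task.
    w_true = rng.normal(size=2)
    X = rng.normal(size=(10, 2))
    y = X @ w_true + 0.1 * rng.normal(size=10)
    return X, y

def loss_and_grad(w, X, y):
    # Mean squared error and its gradient for a linear model.
    r = X @ w - y
    return 0.5 * np.mean(r ** 2), X.T @ r / len(y)

# First-order MAML outer loop: adapt to each sampled task with one inner
# gradient step, then update the meta-parameters using the adapted gradient.
w_meta = np.zeros(2)
inner_lr, outer_lr = 0.1, 0.05
for _ in range(200):
    meta_grad = np.zeros(2)
    for _ in range(4):  # batch of sampled tasks
        X, y = task()
        _, g = loss_and_grad(w_meta, X, y)
        w_adapted = w_meta - inner_lr * g            # inner adaptation step
        _, g_adapt = loss_and_grad(w_adapted, X, y)
        meta_grad += g_adapt                         # first-order MAML gradient
    w_meta -= outer_lr * meta_grad / 4
```

At meta-test time one would sample a fresh task, take the same single inner gradient step from `w_meta`, and evaluate the adapted parameters; the bounds the paper derives control the gap between this adapted-task loss and its population counterpart.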

Details

Language :
English
Database :
OpenAIRE
Journal :
Scopus-Elsevier
Accession number :
edsair.doi.dedup.....91201183efee0caa669a9082d5099c17