A Unified View on PAC-Bayes Bounds for Meta-Learning

Arezou Rezazadeh
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:18576-18595, 2022.

Abstract

Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base-learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization error: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. By upper bounding arbitrary convex functions that link the expected and empirical losses at the environment level and at the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
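As background (and not the paper's own result), the classical single-task PAC-Bayes bound that meta-learning extensions build on can be sketched as follows, where \(P\) is a prior and \(Q\) a posterior over hypotheses, \(n\) is the sample size, and \(\delta\) is the confidence level; exact constants differ across variants:

```latex
% McAllester-style PAC-Bayes bound (illustrative sketch; constants vary by variant).
% With probability at least 1 - \delta over the draw of n i.i.d. samples,
% simultaneously for all posteriors Q:
\[
  \mathbb{E}_{h \sim Q}\!\left[ L(h) \right]
  \;\le\;
  \mathbb{E}_{h \sim Q}\!\left[ \hat{L}(h) \right]
  + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln \frac{2\sqrt{n}}{\delta}}{2n}} .
\]
% Meta-learning versions add a second, environment-level term of the same form,
% reflecting the finite number of observed tasks.
```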

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-rezazadeh22a,
  title     = {A Unified View on {PAC}-{B}ayes Bounds for Meta-Learning},
  author    = {Rezazadeh, Arezou},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {18576--18595},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/rezazadeh22a/rezazadeh22a.pdf},
  url       = {https://proceedings.mlr.press/v162/rezazadeh22a.html},
  abstract  = {Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base-learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization error: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. By upper bounding arbitrary convex functions that link the expected and empirical losses at the environment level and at the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.}
}
Endnote
%0 Conference Paper
%T A Unified View on PAC-Bayes Bounds for Meta-Learning
%A Arezou Rezazadeh
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-rezazadeh22a
%I PMLR
%P 18576--18595
%U https://proceedings.mlr.press/v162/rezazadeh22a.html
%V 162
%X Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base-learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization error: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. By upper bounding arbitrary convex functions that link the expected and empirical losses at the environment level and at the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
APA
Rezazadeh, A. (2022). A Unified View on PAC-Bayes Bounds for Meta-Learning. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:18576-18595. Available from https://proceedings.mlr.press/v162/rezazadeh22a.html.