Hierarchically Structured Meta-learning

Huaxiu Yao, Ying Wei, Junzhou Huang, Zhenhui Li
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7045-7054, 2019.

Abstract

In order to learn quickly with few samples, meta-learning utilizes prior knowledge learned from previous tasks. However, a critical challenge in meta-learning is task uncertainty and heterogeneity, which cannot be handled by globally sharing knowledge among tasks. In this paper, building on gradient-based meta-learning, we propose a hierarchically structured meta-learning (HSML) algorithm that explicitly tailors the transferable knowledge to different clusters of tasks. Inspired by the way human beings organize knowledge, we resort to a hierarchical task clustering structure to cluster tasks. As a result, the proposed approach not only addresses the challenge through knowledge customization for different clusters of tasks, but also preserves knowledge generalization among similar tasks within a cluster. In addition, to handle changing task relationships, we extend the hierarchical structure to a continual learning environment. The experimental results show that our approach achieves state-of-the-art performance on both toy-regression and few-shot image classification problems.
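To make the core idea concrete, the following is a minimal sketch (not the authors' exact architecture) of how a globally shared meta-initialization can be customized per task cluster: a task embedding is softly assigned to clusters, and a mixture of cluster-specific gates modulates the shared initialization before a MAML-style inner-loop update. The clustering layer, gate parameterization, and dimensions here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cluster_adapted_init(theta_global, task_embedding, centers, gates):
    """Softly assign a task to clusters, then customize the shared init.

    theta_global   : (d,)   globally shared meta-initialization
    task_embedding : (e,)   representation of the task (assumed given)
    centers        : (K, e) cluster centers (hypothetical clustering layer)
    gates          : (K, d) per-cluster gate logits on the initialization
    """
    logits = centers @ task_embedding          # similarity to each cluster
    p = softmax(logits)                        # soft cluster assignment, (K,)
    gate = p @ (1.0 / (1.0 + np.exp(-gates)))  # mixture of sigmoid gates, (d,)
    return gate * theta_global                 # cluster-customized init

# Toy usage: a linear-regression task with one MAML-style inner step.
d, e, K = 3, 2, 4
theta_global = rng.normal(size=d)
centers = rng.normal(size=(K, e))
gates = rng.normal(size=(K, d))

emb = rng.normal(size=e)
theta_task = cluster_adapted_init(theta_global, emb, centers, gates)

X = rng.normal(size=(10, d))
y = X @ np.ones(d)
grad = 2.0 * X.T @ (X @ theta_task - y) / len(X)  # MSE gradient
theta_adapted = theta_task - 0.01 * grad          # inner-loop update
```

In the full method, the clustering structure is hierarchical (clusters of clusters) and the assignment and gates are learned jointly with the meta-initialization; this flat single-level version only illustrates the knowledge-customization step.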

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-yao19b,
  title     = {Hierarchically Structured Meta-learning},
  author    = {Yao, Huaxiu and Wei, Ying and Huang, Junzhou and Li, Zhenhui},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7045--7054},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/yao19b/yao19b.pdf},
  url       = {http://proceedings.mlr.press/v97/yao19b.html},
  abstract  = {In order to learn quickly with few samples, meta-learning utilizes prior knowledge learned from previous tasks. However, a critical challenge in meta-learning is task uncertainty and heterogeneity, which can not be handled via globally sharing knowledge among tasks. In this paper, based on gradient-based meta-learning, we propose a hierarchically structured meta-learning (HSML) algorithm that explicitly tailors the transferable knowledge to different clusters of tasks. Inspired by the way human beings organize knowledge, we resort to a hierarchical task clustering structure to cluster tasks. As a result, the proposed approach not only addresses the challenge via the knowledge customization to different clusters of tasks, but also preserves knowledge generalization among a cluster of similar tasks. To tackle the changing of task relationship, in addition, we extend the hierarchical structure to a continual learning environment. The experimental results show that our approach can achieve state-of-the-art performance in both toy-regression and few-shot image classification problems.}
}
Endnote
%0 Conference Paper
%T Hierarchically Structured Meta-learning
%A Huaxiu Yao
%A Ying Wei
%A Junzhou Huang
%A Zhenhui Li
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-yao19b
%I PMLR
%P 7045--7054
%U http://proceedings.mlr.press/v97/yao19b.html
%V 97
%X In order to learn quickly with few samples, meta-learning utilizes prior knowledge learned from previous tasks. However, a critical challenge in meta-learning is task uncertainty and heterogeneity, which can not be handled via globally sharing knowledge among tasks. In this paper, based on gradient-based meta-learning, we propose a hierarchically structured meta-learning (HSML) algorithm that explicitly tailors the transferable knowledge to different clusters of tasks. Inspired by the way human beings organize knowledge, we resort to a hierarchical task clustering structure to cluster tasks. As a result, the proposed approach not only addresses the challenge via the knowledge customization to different clusters of tasks, but also preserves knowledge generalization among a cluster of similar tasks. To tackle the changing of task relationship, in addition, we extend the hierarchical structure to a continual learning environment. The experimental results show that our approach can achieve state-of-the-art performance in both toy-regression and few-shot image classification problems.
APA
Yao, H., Wei, Y., Huang, J. & Li, Z. (2019). Hierarchically Structured Meta-learning. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7045-7054. Available from http://proceedings.mlr.press/v97/yao19b.html.