Meta Networks

Tsendsuren Munkhdalai, Hong Yu
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2554-2563, 2017.

Abstract

Neural networks have been successfully applied in applications with a large amount of labeled data. However, the task of rapid generalization on new concepts with small training data while preserving performances on previously learned ones still presents a significant challenge to neural network models. In this work, we introduce a novel meta learning method, Meta Networks (MetaNet), that learns a meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization. When evaluated on Omniglot and Mini-ImageNet benchmarks, our MetaNet models achieve a near human-level performance and outperform the baseline approaches by up to 6% accuracy. We demonstrate several appealing properties of MetaNet relating to generalization and continual learning.
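The abstract names the mechanism, fast parameterization, without spelling it out. The sketch below is a minimal, hypothetical illustration of one way such a scheme can look: a meta learner maps the gradient of a support-set loss to task-specific fast weights, which are combined with slow weights before classifying the query set. The names (SlowLearner, MetaLearner), shapes, the additive combination of slow and fast weights, and the synthetic data are assumptions for illustration only, not the paper's actual architecture.

# Illustrative sketch only; see the caveats above. Not the MetaNet implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlowLearner(nn.Module):
    """Base classifier whose slow weights are trained across tasks."""
    def __init__(self, in_dim, n_classes):
        super().__init__()
        self.W = nn.Parameter(torch.randn(n_classes, in_dim) * 0.01)

    def forward(self, x, fast_W=None):
        # Combine slow and fast weights; simple addition stands in for the
        # layer-augmentation idea and is an assumption of this sketch.
        W = self.W if fast_W is None else self.W + fast_W
        return F.linear(x, W)

class MetaLearner(nn.Module):
    """Maps the gradient of the support-set loss to task-specific fast weights."""
    def __init__(self, n_params):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_params, 64), nn.ReLU(),
                                 nn.Linear(64, n_params))

    def forward(self, grad_flat):
        return self.net(grad_flat)

in_dim, n_classes = 20, 5
base = SlowLearner(in_dim, n_classes)
meta = MetaLearner(in_dim * n_classes)
opt = torch.optim.Adam(list(base.parameters()) + list(meta.parameters()), lr=1e-3)

# One meta-training step on a synthetic few-shot task (random data as a stand-in).
support_x, support_y = torch.randn(25, in_dim), torch.randint(0, n_classes, (25,))
query_x, query_y = torch.randn(25, in_dim), torch.randint(0, n_classes, (25,))

# 1) Support-set loss and its gradient w.r.t. the slow weights (the meta information).
support_loss = F.cross_entropy(base(support_x), support_y)
grad_W, = torch.autograd.grad(support_loss, base.W, create_graph=True)

# 2) The meta learner turns that gradient into fast weights for this task.
fast_W = meta(grad_W.flatten()).view_as(base.W)

# 3) Query-set loss under the fast-parameterized model updates both learners.
query_loss = F.cross_entropy(base(query_x, fast_W), query_y)
opt.zero_grad()
query_loss.backward()
opt.step()
print(f"query loss: {query_loss.item():.3f}")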

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-munkhdalai17a,
  title     = {Meta Networks},
  author    = {Tsendsuren Munkhdalai and Hong Yu},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2554--2563},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/munkhdalai17a/munkhdalai17a.pdf},
  url       = {https://proceedings.mlr.press/v70/munkhdalai17a.html},
  abstract  = {Neural networks have been successfully applied in applications with a large amount of labeled data. However, the task of rapid generalization on new concepts with small training data while preserving performances on previously learned ones still presents a significant challenge to neural network models. In this work, we introduce a novel meta learning method, Meta Networks (MetaNet), that learns a meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization. When evaluated on Omniglot and Mini-ImageNet benchmarks, our MetaNet models achieve a near human-level performance and outperform the baseline approaches by up to 6\% accuracy. We demonstrate several appealing properties of MetaNet relating to generalization and continual learning.}
}
Endnote
%0 Conference Paper
%T Meta Networks
%A Tsendsuren Munkhdalai
%A Hong Yu
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-munkhdalai17a
%I PMLR
%P 2554--2563
%U https://proceedings.mlr.press/v70/munkhdalai17a.html
%V 70
%X Neural networks have been successfully applied in applications with a large amount of labeled data. However, the task of rapid generalization on new concepts with small training data while preserving performances on previously learned ones still presents a significant challenge to neural network models. In this work, we introduce a novel meta learning method, Meta Networks (MetaNet), that learns a meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization. When evaluated on Omniglot and Mini-ImageNet benchmarks, our MetaNet models achieve a near human-level performance and outperform the baseline approaches by up to 6% accuracy. We demonstrate several appealing properties of MetaNet relating to generalization and continual learning.
APA
Munkhdalai, T. & Yu, H. (2017). Meta Networks. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:2554-2563. Available from https://proceedings.mlr.press/v70/munkhdalai17a.html.