Automated Curriculum Learning for Neural Networks

Alex Graves, Marc G. Bellemare, Jacob Menick, Rémi Munos, Koray Kavukcuoglu
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1311-1320, 2017.

Abstract

We introduce a method for automatically selecting the path, or syllabus, that a neural network follows through a curriculum so as to maximise learning efficiency. A measure of the amount that the network learns from each data sample is provided as a reward signal to a nonstationary multi-armed bandit algorithm, which then determines a stochastic syllabus. We consider a range of signals derived from two distinct indicators of learning progress: rate of increase in prediction accuracy, and rate of increase in network complexity. Experimental results for LSTM networks on three curricula demonstrate that our approach can significantly accelerate learning, in some cases halving the time required to attain a satisfactory performance level.
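The mechanism described above can be sketched in a few lines. Below is a minimal, hypothetical illustration of the idea: an Exp3.S-style nonstationary bandit whose arms are curriculum tasks, updated with an importance-weighted "learning progress" reward (e.g. the drop in training loss on the sampled batch). The class name, hyperparameters, and reward scaling are illustrative assumptions, not the paper's exact algorithm or code.

```python
import math
import random

class CurriculumBandit:
    """Nonstationary multi-armed bandit over curriculum tasks (Exp3.S-style sketch).

    Each arm is a task; the reward fed to `update` is assumed to be a
    learning-progress signal (e.g. decrease in loss), rescaled to roughly [-1, 1].
    """

    def __init__(self, n_tasks, eta=0.1, eps=0.05):
        self.n = n_tasks
        self.eta = eta          # step size for the exponential-weights update
        self.eps = eps          # uniform mixing, keeps all tasks explorable
        self.w = [0.0] * n_tasks  # log-weights, one per task

    def probs(self):
        # Softmax over log-weights, mixed with the uniform distribution.
        m = max(self.w)
        exps = [math.exp(v - m) for v in self.w]
        z = sum(exps)
        return [(1 - self.eps) * e / z + self.eps / self.n for e in exps]

    def sample(self):
        # Draw a task index from the current stochastic syllabus.
        r, acc = random.random(), 0.0
        for i, p in enumerate(self.probs()):
            acc += p
            if r < acc:
                return i
        return self.n - 1

    def update(self, task, reward):
        # Importance-weighted update on the chosen arm only.
        p = self.probs()[task]
        self.w[task] += self.eta * reward / p
```

A training loop would call `sample()` to pick the next task, train on a batch from it, measure learning progress (e.g. loss before minus loss after the gradient step), and pass that as `reward` to `update()`; tasks that currently yield the most progress come to dominate the syllabus.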

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-graves17a,
  title     = {Automated Curriculum Learning for Neural Networks},
  author    = {Alex Graves and Marc G. Bellemare and Jacob Menick and R{\'e}mi Munos and Koray Kavukcuoglu},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {1311--1320},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/graves17a/graves17a.pdf},
  url       = {https://proceedings.mlr.press/v70/graves17a.html},
  abstract  = {We introduce a method for automatically selecting the path, or syllabus, that a neural network follows through a curriculum so as to maximise learning efficiency. A measure of the amount that the network learns from each data sample is provided as a reward signal to a nonstationary multi-armed bandit algorithm, which then determines a stochastic syllabus. We consider a range of signals derived from two distinct indicators of learning progress: rate of increase in prediction accuracy, and rate of increase in network complexity. Experimental results for LSTM networks on three curricula demonstrate that our approach can significantly accelerate learning, in some cases halving the time required to attain a satisfactory performance level.}
}
EndNote
%0 Conference Paper
%T Automated Curriculum Learning for Neural Networks
%A Alex Graves
%A Marc G. Bellemare
%A Jacob Menick
%A Rémi Munos
%A Koray Kavukcuoglu
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-graves17a
%I PMLR
%P 1311--1320
%U https://proceedings.mlr.press/v70/graves17a.html
%V 70
%X We introduce a method for automatically selecting the path, or syllabus, that a neural network follows through a curriculum so as to maximise learning efficiency. A measure of the amount that the network learns from each data sample is provided as a reward signal to a nonstationary multi-armed bandit algorithm, which then determines a stochastic syllabus. We consider a range of signals derived from two distinct indicators of learning progress: rate of increase in prediction accuracy, and rate of increase in network complexity. Experimental results for LSTM networks on three curricula demonstrate that our approach can significantly accelerate learning, in some cases halving the time required to attain a satisfactory performance level.
APA
Graves, A., Bellemare, M.G., Menick, J., Munos, R. & Kavukcuoglu, K. (2017). Automated Curriculum Learning for Neural Networks. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1311-1320. Available from https://proceedings.mlr.press/v70/graves17a.html.