Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search

Yong Guo, Yaofo Chen, Yin Zheng, Peilin Zhao, Jian Chen, Junzhou Huang, Mingkui Tan
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:3822-3831, 2020.

Abstract

Neural architecture search (NAS) has become an important approach to automatically find effective architectures. To cover all possible good architectures, we need to search in an extremely large search space with billions of candidate architectures. More critically, such a large search space raises the challenging issue of space explosion: due to limited computational resources, we can only sample a very small proportion of the candidate architectures, which provides insufficient information for training. As a result, existing methods may often produce sub-optimal architectures. To alleviate this issue, we propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a larger space. With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves search efficiency and finds better architectures than existing NAS methods. Extensive experiments on CIFAR-10 and ImageNet demonstrate the effectiveness of the proposed method.
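
The central idea in the abstract, enlarging the search space in stages and carrying knowledge from the smaller stages into the larger ones, can be illustrated with a small toy script. The sketch below is not the authors' CNAS implementation; the operation list, the proxy_score stand-in for validation accuracy, the mutate routine, and the stage schedule are all hypothetical placeholders that only show the curriculum-over-search-space structure:

import random

# Hypothetical operation vocabulary; early stages use only a prefix of this list.
ALL_OPS = ["skip", "conv3x3", "conv5x5", "sep_conv3x3", "sep_conv5x5",
           "dil_conv3x3", "max_pool3x3", "avg_pool3x3"]
NUM_EDGES = 8            # edges in a toy cell; each edge picks one operation
STAGE_SIZES = [3, 5, 8]  # curriculum: the candidate operation set grows stage by stage

def proxy_score(arch):
    """Stand-in for validation accuracy; a real system would train/evaluate a model here."""
    rng = random.Random(hash(tuple(arch)))
    return rng.random()

def mutate(arch, ops):
    """Resample one edge's operation from the current (possibly enlarged) operation set."""
    child = list(arch)
    child[random.randrange(NUM_EDGES)] = random.choice(ops)
    return child

best_arch, best_score = None, -1.0
for stage, k in enumerate(STAGE_SIZES):
    ops = ALL_OPS[:k]  # current-stage search space
    # Warm-start from the best architecture found in the previous, smaller stage.
    current = best_arch or [random.choice(ops) for _ in range(NUM_EDGES)]
    for _ in range(200):  # per-stage sampling budget
        candidate = mutate(current, ops)
        score = proxy_score(candidate)
        if score > best_score:
            best_arch, best_score, current = candidate, score, candidate
    print(f"stage {stage}: |ops| = {k}, best proxy score = {best_score:.3f}")

print("best architecture:", best_arch)

In CNAS itself, what is carried across stages is the learned knowledge that guides the search (see the paper for details), rather than the single best architecture used in this toy; only the staged enlargement of the search space is mirrored here.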

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-guo20b,
  title     = {Breaking the Curse of Space Explosion: Towards Efficient {NAS} with Curriculum Search},
  author    = {Guo, Yong and Chen, Yaofo and Zheng, Yin and Zhao, Peilin and Chen, Jian and Huang, Junzhou and Tan, Mingkui},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {3822--3831},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/guo20b/guo20b.pdf},
  url       = {https://proceedings.mlr.press/v119/guo20b.html}
}
Endnote
%0 Conference Paper
%T Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search
%A Yong Guo
%A Yaofo Chen
%A Yin Zheng
%A Peilin Zhao
%A Jian Chen
%A Junzhou Huang
%A Mingkui Tan
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-guo20b
%I PMLR
%P 3822--3831
%U https://proceedings.mlr.press/v119/guo20b.html
%V 119
APA
Guo, Y., Chen, Y., Zheng, Y., Zhao, P., Chen, J., Huang, J. & Tan, M. (2020). Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:3822-3831. Available from https://proceedings.mlr.press/v119/guo20b.html.

Related Material

Download PDF: http://proceedings.mlr.press/v119/guo20b/guo20b.pdf