EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search

Zheng Jian, Han Wenran, Zhang Ying, Ji Shufan
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:1261-1276, 2023.

Abstract

Neural Architecture Search (NAS) has been widely applied to automatic neural architecture design. Traditional NAS methods often evaluate a large number of architectures, leading to expensive computational overhead. To speed up architecture search, recent NAS methods employ network estimation strategies to guide the selection of promising architectures. In this paper, we propose an efficient evolutionary algorithm for NAS, which adapts a state-of-the-art proxy based on synthetic signal bases for architecture estimation. Extensive experiments show that our method outperforms state-of-the-art NAS methods on the NAS-Bench-101 and NAS-Bench-201 search spaces (CIFAR-10, CIFAR-100, and ImageNet16-120). Compared with existing works, our method identifies better architectures with greatly reduced search time.
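To illustrate the general idea of proxy-guided evolutionary search (a minimal sketch only, not the authors' EENAS implementation; the architecture encoding, mutation operator, and proxy_score function below are hypothetical placeholders), the following Python snippet shows how an evolutionary loop can rank candidates with a cheap, training-free score instead of full training:

import random

OPS = ["none", "skip", "conv1x1", "conv3x3", "avgpool"]  # candidate operations
NUM_EDGES = 6  # edges in a NAS-Bench-201-style cell

def random_arch():
    """Encode an architecture as one operation choice per cell edge."""
    return tuple(random.randrange(len(OPS)) for _ in range(NUM_EDGES))

def mutate(arch):
    """Resample the operation on one randomly chosen edge."""
    edge = random.randrange(NUM_EDGES)
    ops = list(arch)
    ops[edge] = random.randrange(len(OPS))
    return tuple(ops)

def proxy_score(arch):
    """Stand-in for a training-free estimator (e.g. a synthetic-signal
    proxy); here just a deterministic pseudo-score for demonstration."""
    return sum((i + 1) * op for i, op in enumerate(arch)) % 97

def evolve(pop_size=20, cycles=200, sample_size=5):
    """Aging evolution: tournament selection plus mutation, scored by proxy."""
    population = [random_arch() for _ in range(pop_size)]
    best = max(population, key=proxy_score)
    for _ in range(cycles):
        # Tournament selection: mutate the best of a random sample.
        parent = max(random.sample(population, sample_size), key=proxy_score)
        child = mutate(parent)
        population.append(child)
        population.pop(0)  # age out the oldest individual
        if proxy_score(child) > proxy_score(best):
            best = child
    return best

if __name__ == "__main__":
    print("best architecture:", evolve())

Because the proxy is evaluated without any training, the loop's cost is dominated by the number of candidates scored rather than by gradient descent, which is the source of the search-time savings the abstract describes.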

Cite this Paper


BibTeX
@InProceedings{pmlr-v189-jian23a,
  title     = {EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search},
  author    = {Jian, Zheng and Wenran, Han and Ying, Zhang and Shufan, Ji},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  pages     = {1261--1276},
  year      = {2023},
  editor    = {Khan, Emtiyaz and Gonen, Mehmet},
  volume    = {189},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v189/jian23a/jian23a.pdf},
  url       = {https://proceedings.mlr.press/v189/jian23a.html}
}
Endnote
%0 Conference Paper
%T EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search
%A Zheng Jian
%A Han Wenran
%A Zhang Ying
%A Ji Shufan
%B Proceedings of The 14th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Emtiyaz Khan
%E Mehmet Gonen
%F pmlr-v189-jian23a
%I PMLR
%P 1261--1276
%U https://proceedings.mlr.press/v189/jian23a.html
%V 189
APA
Jian, Z., Wenran, H., Ying, Z. &amp; Shufan, J. (2023). EENAS: An Efficient Evolutionary Algorithm for Neural Architecture Search. Proceedings of The 14th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 189:1261-1276. Available from https://proceedings.mlr.press/v189/jian23a.html.
