EG-ENAS: Efficient and Generalizable Evolutionary Neural Architecture Search for Image Classification

Mateo Avila Pava, René Groh, Andreas M Kist
Proceedings of the Fourth International Conference on Automated Machine Learning, PMLR 293:10/1-20, 2025.

Abstract

Neural Architecture Search (NAS) has become a powerful method for automating the design of deep neural networks in various applications. Among the different optimization techniques, evolutionary approaches stand out for their flexibility, robustness, and capacity to explore diverse solutions. However, evaluating neural architectures typically requires training, making NAS resource-intensive and time-consuming. Additionally, many NAS methods lack generalizability, as they are often tested only on a small set of benchmark datasets. To address these two challenges, we propose a new efficient NAS framework based on evolutionary computation, which reuses available pretrained weights and uses proxies to reduce redundant computations. We initially selected a reduced RegNetY search space and incorporated architectural improvements and regularization techniques for training. We developed a dataset-aware augmentation selection method to efficiently identify the best transform for each dataset using zero-cost proxies. Additionally, we propose a ranking regressor to filter low-potential models during initial population sampling. To reduce training time, we introduce a weight-sharing strategy for RegNets that reuses pretrained stages and transfers the stem from parent to child models across generations. Experimental results show that our low-cost (T0) and full EG-ENAS (T6) configurations consistently achieve robust performance across eleven datasets, outperforming Random Search (T1) and simple Evolutionary NAS (T2) with competitive results in under a 24-hour time budget on seven validation datasets. We achieve state-of-the-art accuracy on one dataset and surpass the 2023 Unseen NAS Challenge top scores on four datasets. The code is available at https://github.com/ankilab/EG-ENAS
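For orientation, the sketch below illustrates how the mechanisms named in the abstract (a zero-cost proxy used to filter an initial population, and stem-weight transfer from parent to child RegNet-style models across generations) could be wired together in an evolutionary loop. It is a minimal, illustrative approximation under stated assumptions, not the authors' implementation: the model class TinyRegNet, the proxy grad_norm_proxy (a one-batch gradient-norm score), and the helper transfer_stem are hypothetical names, and the paper's ranking regressor, augmentation selection, and pretrained-stage reuse are only loosely approximated here.

# Hedged sketch: NOT the EG-ENAS implementation, only an illustration of two ideas
# named in the abstract (zero-cost proxy filtering + parent-to-child stem transfer).
import random
import torch
import torch.nn as nn

class TinyRegNet(nn.Module):
    """Toy stand-in for a RegNetY candidate: a fixed stem plus variable-width stages."""
    def __init__(self, widths=(16, 32), num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU())
        stages, in_ch = [], 16
        for w in widths:
            stages += [nn.Conv2d(in_ch, w, 3, stride=2, padding=1), nn.BatchNorm2d(w), nn.ReLU()]
            in_ch = w
        self.stages = nn.Sequential(*stages)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, num_classes))
        self.widths = widths

    def forward(self, x):
        return self.head(self.stages(self.stem(x)))

def grad_norm_proxy(model, batch, labels):
    """A simple zero-cost proxy (assumption): gradient norm of the loss on one mini-batch."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(batch), labels)
    loss.backward()
    return sum(p.grad.abs().sum().item() for p in model.parameters() if p.grad is not None)

def transfer_stem(parent, child):
    """Copy stem weights from parent to child; both share the same stem shape here."""
    child.stem.load_state_dict(parent.stem.state_dict())

def random_candidate():
    return TinyRegNet(widths=(random.choice([8, 16, 32]), random.choice([16, 32, 64])))

if __name__ == "__main__":
    batch = torch.randn(8, 3, 32, 32)
    labels = torch.randint(0, 10, (8,))

    # 1) Sample a pool and keep the top-k by proxy score
    #    (a crude stand-in for the paper's ranking regressor).
    pool = [random_candidate() for _ in range(8)]
    scored = sorted(pool, key=lambda m: grad_norm_proxy(m, batch, labels), reverse=True)
    population = scored[:4]

    # 2) One "generation": sample a child and reuse the best parent's stem weights.
    parent = population[0]
    child = random_candidate()
    transfer_stem(parent, child)
    print("parent widths:", parent.widths, "-> child widths:", child.widths)

A real run would then train the surviving candidates, mutate them across generations, and keep reusing weights in this way; that reuse, together with the proxy-based filtering, is where the time savings described in the abstract come from.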

Cite this Paper


BibTeX
@InProceedings{pmlr-v293-pava25a,
  title     = {EG-ENAS: Efficient and Generalizable Evolutionary Neural Architecture Search for Image Classification},
  author    = {Pava, Mateo Avila and Groh, Ren\'e and Kist, Andreas M},
  booktitle = {Proceedings of the Fourth International Conference on Automated Machine Learning},
  pages     = {10/1--20},
  year      = {2025},
  editor    = {Akoglu, Leman and Doerr, Carola and van Rijn, Jan N. and Garnett, Roman and Gardner, Jacob R.},
  volume    = {293},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--11 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v293/main/assets/pava25a/pava25a.pdf},
  url       = {https://proceedings.mlr.press/v293/pava25a.html},
  abstract  = {Neural Architecture Search (NAS) has become a powerful method for automating the design of deep neural networks in various applications. Among the different optimization techniques, evolutionary approaches stand out for their flexibility, robustness, and capacity to explore diverse solutions. However, evaluating neural architectures typically requires training, making NAS resource-intensive and time-consuming. Additionally, many NAS methods lack generalizability, as they are often tested only on a small set of benchmark datasets. To address these two challenges, we propose a new efficient NAS framework based on evolutionary computation, which reuses available pretrained weights and uses proxies to reduce redundant computations. We initially selected a reduced RegNetY search space and incorporated architectural improvements and regularization techniques for training. We developed a dataset-aware augmentation selection method to efficiently identify the best transform for each dataset using zero-cost proxies. Additionally, we propose a ranking regressor to filter low-potential models during initial population sampling. To reduce training time, we introduce a weight-sharing strategy for RegNets that reuses pretrained stages and transfers the stem from parent to child models across generations. Experimental results show that our low-cost (T0) and full EG-ENAS (T6) configurations consistently achieve robust performance across eleven datasets, outperforming Random Search (T1) and simple Evolutionary NAS (T2) with competitive results in under a 24-hour time budget on seven validation datasets. We achieve state-of-the-art accuracy on one and surpass the 2023 Unseen NAS Challenge top scores on four datasets. The code is available at this link: \url{https://github.com/ankilab/EG-ENAS}}
}
Endnote
%0 Conference Paper
%T EG-ENAS: Efficient and Generalizable Evolutionary Neural Architecture Search for Image Classification
%A Mateo Avila Pava
%A René Groh
%A Andreas M Kist
%B Proceedings of the Fourth International Conference on Automated Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Leman Akoglu
%E Carola Doerr
%E Jan N. van Rijn
%E Roman Garnett
%E Jacob R. Gardner
%F pmlr-v293-pava25a
%I PMLR
%P 10/1--20
%U https://proceedings.mlr.press/v293/pava25a.html
%V 293
%X Neural Architecture Search (NAS) has become a powerful method for automating the design of deep neural networks in various applications. Among the different optimization techniques, evolutionary approaches stand out for their flexibility, robustness, and capacity to explore diverse solutions. However, evaluating neural architectures typically requires training, making NAS resource-intensive and time-consuming. Additionally, many NAS methods lack generalizability, as they are often tested only on a small set of benchmark datasets. To address these two challenges, we propose a new efficient NAS framework based on evolutionary computation, which reuses available pretrained weights and uses proxies to reduce redundant computations. We initially selected a reduced RegNetY search space and incorporated architectural improvements and regularization techniques for training. We developed a dataset-aware augmentation selection method to efficiently identify the best transform for each dataset using zero-cost proxies. Additionally, we propose a ranking regressor to filter low-potential models during initial population sampling. To reduce training time, we introduce a weight-sharing strategy for RegNets that reuses pretrained stages and transfers the stem from parent to child models across generations. Experimental results show that our low-cost (T0) and full EG-ENAS (T6) configurations consistently achieve robust performance across eleven datasets, outperforming Random Search (T1) and simple Evolutionary NAS (T2) with competitive results in under a 24-hour time budget on seven validation datasets. We achieve state-of-the-art accuracy on one and surpass the 2023 Unseen NAS Challenge top scores on four datasets. The code is available at this link: https://github.com/ankilab/EG-ENAS
APA
Pava, M.A., Groh, R. & Kist, A.M. (2025). EG-ENAS: Efficient and Generalizable Evolutionary Neural Architecture Search for Image Classification. Proceedings of the Fourth International Conference on Automated Machine Learning, in Proceedings of Machine Learning Research 293:10/1-20. Available from https://proceedings.mlr.press/v293/pava25a.html.