EG-ENAS: Efficient and Generalizable Evolutionary Neural Architecture Search for Image Classification
Proceedings of the Fourth International Conference on Automated Machine Learning, PMLR 293:10/1-20, 2025.
Abstract
Neural Architecture Search (NAS) has become a powerful method for automating the design of deep neural networks across a variety of applications. Among the different optimization techniques, evolutionary approaches stand out for their flexibility, robustness, and capacity to explore diverse solutions. However, evaluating neural architectures typically requires training them, making NAS resource-intensive and time-consuming. Additionally, many NAS methods lack generalizability, as they are often tested only on a small set of benchmark datasets. To address these two challenges, we propose a new efficient NAS framework based on evolutionary computation, which reuses available pretrained weights and uses proxies to reduce redundant computation. We adopt a reduced RegNetY search space and incorporate architectural improvements and regularization techniques for training. We develop a dataset-aware augmentation selection method that uses zero-cost proxies to efficiently identify the best transform for each dataset. Additionally, we propose a ranking regressor to filter out low-potential models during initial population sampling. To reduce training time, we introduce a weight-sharing strategy for RegNets that reuses pretrained stages and transfers the stem from parent to child models across generations. Experimental results show that our low-cost (T0) and full EG-ENAS (T6) configurations consistently achieve robust performance across eleven datasets, outperforming Random Search (T1) and simple Evolutionary NAS (T2) on seven validation datasets, with competitive results within a 24-hour time budget. We achieve state-of-the-art accuracy on one dataset and surpass the 2023 Unseen NAS Challenge top scores on four others. The code is available at https://github.com/ankilab/EG-ENAS
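
The dataset-aware augmentation selection can be illustrated with a minimal sketch: score each candidate transform with a zero-cost proxy on a few mini-batches of the target dataset and keep the highest-scoring one. The gradient-norm proxy, the candidate pool, and the helper names (make_loader, select_transform) below are illustrative assumptions, not the paper's exact implementation.

import torch.nn as nn
import torchvision.transforms as T

# Hypothetical candidate pool; the paper's actual transform set may differ.
CANDIDATE_TRANSFORMS = {
    "flip": T.RandomHorizontalFlip(),
    "crop": T.RandomCrop(32, padding=4),
    "autoaugment": T.AutoAugment(T.AutoAugmentPolicy.CIFAR10),
}

def grad_norm_proxy(model, loader, n_batches=3):
    """Zero-cost proxy: mean gradient norm over a few mini-batches."""
    criterion = nn.CrossEntropyLoss()
    total = 0.0
    for i, (x, y) in enumerate(loader):
        if i >= n_batches:
            break
        model.zero_grad()
        criterion(model(x), y).backward()
        total += sum(p.grad.norm().item()
                     for p in model.parameters() if p.grad is not None)
    return total / n_batches

def select_transform(make_loader, model):
    """Return the candidate transform whose augmented data scores highest.

    `make_loader(t)` (hypothetical) builds a small DataLoader that
    applies transform `t` to the target dataset.
    """
    scores = {name: grad_norm_proxy(model, make_loader(t))
              for name, t in CANDIDATE_TRANSFORMS.items()}
    return max(scores, key=scores.get)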
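
The weight-sharing idea can likewise be sketched as copying shape-compatible tensors between state dicts: the same helper can transfer the stem from a parent to a child across generations or reuse pretrained stages. The prefix-based matching and the torchvision-style RegNet key names ("stem.", "trunk_output.block1.") are assumptions for illustration, not the paper's exact mechanism.

def transfer_weights(child, source, prefixes=("stem.",)):
    """Copy shape-compatible tensors whose keys start with `prefixes`
    from `source` into `child` (both torch.nn.Module instances)."""
    src = source.state_dict()
    dst = child.state_dict()
    for key, tensor in src.items():
        if key.startswith(prefixes) and key in dst \
                and dst[key].shape == tensor.shape:
            dst[key] = tensor.clone()
    child.load_state_dict(dst)
    return child

# e.g., transfer the stem from a parent across generations, and reuse a
# pretrained stage (torchvision-style RegNet key names assumed):
#   transfer_weights(child, parent, prefixes=("stem.",))
#   transfer_weights(child, pretrained, prefixes=("trunk_output.block1.",))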