Transferrable Surrogates in Expressive Neural Architecture Search Spaces

Shiwen Qin, Gabriela Kadlecová, Martin Pilát, Shay B Cohen, Roman Neruda, Elliot J. Crowley, Jovita Lukasik, Linus Ericsson
Proceedings of the Fourth International Conference on Automated Machine Learning, PMLR 293:16/1-29, 2025.

Abstract

Neural architecture search (NAS) faces a challenge in balancing the exploration of expressive, broad search spaces that enable architectural innovation with the need for efficient evaluation of architectures to effectively search such spaces. We investigate surrogate model training for improving search in highly expressive NAS search spaces based on context-free grammars. We show that i) surrogate models trained either using zero-cost-proxy metrics and neural graph features (GRAF) or by fine-tuning an off-the-shelf LM have high predictive power for the performance of architectures both within and across datasets, ii) these surrogates can be used to filter out bad architectures when searching on novel datasets, thereby significantly speeding up search and achieving better final performance, and iii) the surrogates can be further used directly as the search objective for huge speed-ups.
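
To make the two search strategies from the abstract concrete, below is a minimal Python sketch of surrogate-filtered evolutionary search. It is an illustration under assumptions, not the authors' implementation: mutate, surrogate_score, and train_and_evaluate are hypothetical placeholders standing in for a grammar-based mutation operator, the trained surrogate (e.g. a GRAF-feature predictor or a fine-tuned LM), and full training on the target dataset.

import random

def surrogate_filtered_search(initial_population, mutate, surrogate_score,
                              train_and_evaluate, n_iterations=100, n_candidates=8):
    """Evolutionary NAS loop where a surrogate prunes candidates before costly training.

    All arguments are hypothetical placeholders:
      initial_population -- list of architecture encodings (e.g. grammar derivations)
      mutate             -- proposes a modified architecture from a parent
      surrogate_score    -- cheap predicted performance (higher is better)
      train_and_evaluate -- expensive ground-truth accuracy on the target dataset
    """
    # Seed the population with fully evaluated architectures.
    population = [(arch, train_and_evaluate(arch)) for arch in initial_population]
    for _ in range(n_iterations):
        # Tournament selection of a parent by measured performance.
        parent, _ = max(random.sample(population, k=min(5, len(population))),
                        key=lambda pair: pair[1])
        # Propose several mutations, but fully train only the one the
        # surrogate ranks highest -- the "filtering" use of the surrogate.
        candidates = [mutate(parent) for _ in range(n_candidates)]
        best_candidate = max(candidates, key=surrogate_score)
        population.append((best_candidate, train_and_evaluate(best_candidate)))
    return max(population, key=lambda pair: pair[1])

Replacing train_and_evaluate with surrogate_score inside the loop corresponds to point iii) of the abstract, using the surrogate directly as the search objective: evaluation fidelity is traded for a large speed-up, and only the few architectures returned at the end need full training.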

Cite this Paper

BibTeX
@InProceedings{pmlr-v293-qin25a,
  title     = {Transferrable Surrogates in Expressive Neural Architecture Search Spaces},
  author    = {Qin, Shiwen and Kadlecov\'a, Gabriela and Pil\'at, Martin and Cohen, Shay B and Neruda, Roman and Crowley, Elliot J. and Lukasik, Jovita and Ericsson, Linus},
  booktitle = {Proceedings of the Fourth International Conference on Automated Machine Learning},
  pages     = {16/1--29},
  year      = {2025},
  editor    = {Akoglu, Leman and Doerr, Carola and van Rijn, Jan N. and Garnett, Roman and Gardner, Jacob R.},
  volume    = {293},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--11 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v293/main/assets/qin25a/qin25a.pdf},
  url       = {https://proceedings.mlr.press/v293/qin25a.html},
  abstract  = {Neural architecture search (NAS) faces a challenge in balancing the exploration of expressive, broad search spaces that enable architectural innovation with the need for efficient evaluation of architectures to effectively search such spaces. We investigate surrogate model training for improving search in highly expressive NAS search spaces based on context-free grammars. We show that i) surrogate models trained either using zero-cost-proxy metrics and neural graph features (GRAF) or by fine-tuning an off-the-shelf LM have high predictive power for the performance of architectures both within and across datasets, ii) these surrogates can be used to filter out bad architectures when searching on novel datasets, thereby significantly speeding up search and achieving better final performance, and iii) the surrogates can be further used directly as the search objective for huge speed-ups.}
}
Endnote
%0 Conference Paper
%T Transferrable Surrogates in Expressive Neural Architecture Search Spaces
%A Shiwen Qin
%A Gabriela Kadlecová
%A Martin Pilát
%A Shay B Cohen
%A Roman Neruda
%A Elliot J. Crowley
%A Jovita Lukasik
%A Linus Ericsson
%B Proceedings of the Fourth International Conference on Automated Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Leman Akoglu
%E Carola Doerr
%E Jan N. van Rijn
%E Roman Garnett
%E Jacob R. Gardner
%F pmlr-v293-qin25a
%I PMLR
%P 16/1--29
%U https://proceedings.mlr.press/v293/qin25a.html
%V 293
%X Neural architecture search (NAS) faces a challenge in balancing the exploration of expressive, broad search spaces that enable architectural innovation with the need for efficient evaluation of architectures to effectively search such spaces. We investigate surrogate model training for improving search in highly expressive NAS search spaces based on context-free grammars. We show that i) surrogate models trained either using zero-cost-proxy metrics and neural graph features (GRAF) or by fine-tuning an off-the-shelf LM have high predictive power for the performance of architectures both within and across datasets, ii) these surrogates can be used to filter out bad architectures when searching on novel datasets, thereby significantly speeding up search and achieving better final performance, and iii) the surrogates can be further used directly as the search objective for huge speed-ups.
APA
Qin, S., Kadlecová, G., Pilát, M., Cohen, S.B., Neruda, R., Crowley, E.J., Lukasik, J. & Ericsson, L. (2025). Transferrable Surrogates in Expressive Neural Architecture Search Spaces. Proceedings of the Fourth International Conference on Automated Machine Learning, in Proceedings of Machine Learning Research 293:16/1-29. Available from https://proceedings.mlr.press/v293/qin25a.html.