confopt: A Library for Implementation and Evaluation of Gradient-based One-Shot NAS Methods

Abhash Kumar Jha, Shakiba Moradian, Arjun Krishnakumar, Martin Rapp, Frank Hutter
Proceedings of the Fourth International Conference on Automated Machine Learning, PMLR 293:20/1-24, 2025.

Abstract

Gradient-based one-shot neural architecture search (NAS) has significantly reduced the cost of exploring architectural spaces with discrete design choices, such as selecting operations within a model. However, the field faces two major challenges. First, evaluations of gradient-based NAS methods heavily rely on the DARTS benchmark, despite the existence of other available benchmarks. This overreliance has led to saturation, with reported improvements often falling within the margin of noise. Second, implementations of gradient-based one-shot NAS methods are fragmented across disparate repositories, complicating fair and reproducible comparisons and further development. In this paper, we introduce Configurable Optimizer (confopt), an extensible library designed to streamline the development and evaluation of gradient-based one-shot NAS methods. Confopt provides a minimal API that makes it easy for users to integrate new search spaces, while also supporting the decomposition of NAS optimizers into their core components. We use this framework to create a suite of new DARTS-based benchmarks, and combine them with a novel evaluation protocol to reveal a critical flaw in how gradient-based one-shot NAS methods are currently assessed. The code is available at https://github.com/automl/ConfigurableOptimizer
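For readers unfamiliar with the continuous relaxation underlying gradient-based one-shot NAS, the following is a minimal, self-contained PyTorch sketch of a DARTS-style mixed operation, in which a discrete choice between candidate operations is replaced by a softmax-weighted sum over them. It illustrates the general technique only; the class name, candidate operation set, and interface are illustrative assumptions and do not correspond to confopt's actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """DARTS-style relaxation: a discrete operation choice becomes a
    softmax-weighted sum of all candidate operations (hypothetical example)."""

    def __init__(self, channels: int):
        super().__init__()
        # Illustrative candidate operations; real search spaces use larger sets.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Architecture parameters: one logit per candidate operation,
        # optimized by gradient descent alongside the model weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# After search, the edge is typically discretized by keeping the operation
# with the largest architecture weight:
#   best_op = mixed_op.ops[mixed_op.alpha.argmax()]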

Cite this Paper


BibTeX
@InProceedings{pmlr-v293-jha25a,
  title     = {\texttt{confopt}: A Library for Implementation and Evaluation of Gradient-based One-Shot NAS Methods},
  author    = {Jha, Abhash Kumar and Moradian, Shakiba and Krishnakumar, Arjun and Rapp, Martin and Hutter, Frank},
  booktitle = {Proceedings of the Fourth International Conference on Automated Machine Learning},
  pages     = {20/1--24},
  year      = {2025},
  editor    = {Akoglu, Leman and Doerr, Carola and van Rijn, Jan N. and Garnett, Roman and Gardner, Jacob R.},
  volume    = {293},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--11 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v293/main/assets/jha25a/jha25a.pdf},
  url       = {https://proceedings.mlr.press/v293/jha25a.html},
  abstract  = {Gradient-based one-shot neural architecture search (NAS) has significantly reduced the cost of exploring architectural spaces with discrete design choices, such as selecting operations within a model. However, the field faces two major challenges. First, evaluations of gradient-based NAS methods heavily rely on the DARTS benchmark, despite the existence of other available benchmarks. This overreliance has led to saturation, with reported improvements often falling within the margin of noise. Second, implementations of gradient-based one-shot NAS methods are fragmented across disparate repositories, complicating fair and reproducible comparisons and further development. In this paper, we introduce Configurable Optimizer (confopt), an extensible library designed to streamline the development and evaluation of gradient-based one-shot NAS methods. Confopt provides a minimal API that makes it easy for users to integrate new search spaces, while also supporting the decomposition of NAS optimizers into their core components. We use this framework to create a suite of new DARTS-based benchmarks, and combine them with a novel evaluation protocol to reveal a critical flaw in how gradient-based one-shot NAS methods are currently assessed. The code can be found under this link: \url{https://github.com/automl/ConfigurableOptimizer}}
}
Endnote
%0 Conference Paper
%T confopt: A Library for Implementation and Evaluation of Gradient-based One-Shot NAS Methods
%A Abhash Kumar Jha
%A Shakiba Moradian
%A Arjun Krishnakumar
%A Martin Rapp
%A Frank Hutter
%B Proceedings of the Fourth International Conference on Automated Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Leman Akoglu
%E Carola Doerr
%E Jan N. van Rijn
%E Roman Garnett
%E Jacob R. Gardner
%F pmlr-v293-jha25a
%I PMLR
%P 20/1--24
%U https://proceedings.mlr.press/v293/jha25a.html
%V 293
%X Gradient-based one-shot neural architecture search (NAS) has significantly reduced the cost of exploring architectural spaces with discrete design choices, such as selecting operations within a model. However, the field faces two major challenges. First, evaluations of gradient-based NAS methods heavily rely on the DARTS benchmark, despite the existence of other available benchmarks. This overreliance has led to saturation, with reported improvements often falling within the margin of noise. Second, implementations of gradient-based one-shot NAS methods are fragmented across disparate repositories, complicating fair and reproducible comparisons and further development. In this paper, we introduce Configurable Optimizer (confopt), an extensible library designed to streamline the development and evaluation of gradient-based one-shot NAS methods. Confopt provides a minimal API that makes it easy for users to integrate new search spaces, while also supporting the decomposition of NAS optimizers into their core components. We use this framework to create a suite of new DARTS-based benchmarks, and combine them with a novel evaluation protocol to reveal a critical flaw in how gradient-based one-shot NAS methods are currently assessed. The code can be found under this link: https://github.com/automl/ConfigurableOptimizer
APA
Jha, A. K., Moradian, S., Krishnakumar, A., Rapp, M. & Hutter, F. (2025). confopt: A Library for Implementation and Evaluation of Gradient-based One-Shot NAS Methods. Proceedings of the Fourth International Conference on Automated Machine Learning, in Proceedings of Machine Learning Research 293:20/1-24. Available from https://proceedings.mlr.press/v293/jha25a.html.