Analyzing Few-Shot Neural Architecture Search in a Metric-Driven Framework

Timotée Ly-Manson, Mathieu Léonardon, Abdeldjalil Aissa El Bey, Ghouthi Boukli Hacene, Lukas Mauch
Proceedings of the Third International Conference on Automated Machine Learning, PMLR 256:5/1-33, 2024.

Abstract

While Neural Architecture Search (NAS) methods help find optimal neural network architectures for diverse tasks, they often come with unreasonable costs. To tackle such a drawback, the one-shot NAS setting was introduced, where a supernet is used as a superposition of all architectures in the space and performs the search in a single training phase. While this method significantly reduces the cost of running NAS, the joint optimization of every architecture degrades the performance of the search. The few-shot NAS line of work tackles this issue by splitting the supernet into sub-supernets trained separately, each with a reduced level of weight-sharing, which gives rise to the new challenge of finding the best way to split the supernet. In particular, GM-NAS utilizes a gradient matching score to group operations in a splitting schema. We extend and generalize this method by building a framework with compatibility for any arbitrary architecture evaluation metric, enabling the generation of numerous and diverse splits. We leverage this new framework in conjunction with various metrics from the zero-shot NAS literature and investigate the benefits of splitting across algorithms and metrics. We find that architectures are distributed in disadvantageous ways inside splits, and that proposed supernet selection methods are flawed.
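To make the splitting idea from the abstract concrete, below is a minimal sketch (not the authors' implementation) of how the candidate operations on a single supernet edge could be partitioned into sub-supernets using an arbitrary pairwise architecture-evaluation metric, in the spirit of the metric-driven framework described above. The function name split_edge, the greedy bisection heuristic, and the operation names are illustrative assumptions; the actual paper uses metrics such as gradient matching or zero-shot proxies to score the candidate splits.

```python
import itertools
import numpy as np

def split_edge(operations, metric_fn, seed_mask_value=np.inf):
    """Partition the candidate operations on one supernet edge into two groups.

    `metric_fn(op_a, op_b)` is assumed to return a similarity score between the
    sub-supernets obtained by keeping only `op_a` or only `op_b` on this edge
    (e.g. a gradient-matching score or any zero-shot proxy). Operations that
    behave similarly under the metric are kept in the same sub-supernet, so
    weight-sharing is reduced between dissimilar operations.
    """
    n = len(operations)
    sim = np.zeros((n, n))
    # Pairwise similarity matrix between candidate operations.
    for i, j in itertools.combinations(range(n), 2):
        sim[i, j] = sim[j, i] = metric_fn(operations[i], operations[j])

    # Greedy bisection: seed the two groups with the least-similar pair,
    # then assign every remaining operation to the seed it resembles most.
    masked = sim.copy()
    np.fill_diagonal(masked, seed_mask_value)
    i0, j0 = np.unravel_index(np.argmin(masked), masked.shape)
    groups = [[operations[i0]], [operations[j0]]]
    for k in range(n):
        if k in (i0, j0):
            continue
        groups[0 if sim[k, i0] >= sim[k, j0] else 1].append(operations[k])
    return groups

# Hypothetical usage with a dummy metric: operations whose scores are close
# end up in the same sub-supernet.
ops = ["conv3x3", "conv5x5", "skip_connect", "max_pool"]
scores = {"conv3x3": 0.9, "conv5x5": 0.85, "skip_connect": 0.1, "max_pool": 0.2}
dummy_metric = lambda a, b: 1.0 - abs(scores[a] - scores[b])
print(split_edge(ops, dummy_metric))
```

Swapping `metric_fn` for a different zero-shot proxy is what lets the framework generate many diverse splits of the same supernet, which is the property the paper exploits to compare splitting strategies.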

Cite this Paper


BibTeX
@InProceedings{pmlr-v256-ly-manson24a,
  title     = {Analyzing Few-Shot Neural Architecture Search in a Metric-Driven Framework},
  author    = {Ly-Manson, Timot\'ee and L\'eonardon, Mathieu and El Bey, Abdeldjalil Aissa and Hacene, Ghouthi Boukli and Mauch, Lukas},
  booktitle = {Proceedings of the Third International Conference on Automated Machine Learning},
  pages     = {5/1--33},
  year      = {2024},
  editor    = {Eggensperger, Katharina and Garnett, Roman and Vanschoren, Joaquin and Lindauer, Marius and Gardner, Jacob R.},
  volume    = {256},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v256/main/assets/ly-manson24a/ly-manson24a.pdf},
  url       = {https://proceedings.mlr.press/v256/ly-manson24a.html},
  abstract  = {While Neural Architecture Search (NAS) methods help find optimal neural network architectures for diverse tasks, they often come with unreasonable costs. To tackle such a drawback, the one-shot NAS setting was introduced, where a supernet is used as a superposition of all architectures in the space and performs the search in a single training phase. While this method significantly reduces the cost of running NAS, the joint optimization of every architecture degrades the performance of the search. The few-shot NAS line of work tackles this issue by splitting the supernet into sub-supernets trained separately, each with a reduced level of weight-sharing, which gives rise to the new challenge of finding the best way to split the supernet. In particular, GM-NAS utilizes a gradient matching score to group operations in a splitting schema. We extend and generalize this method by building a framework with compatibility for any arbitrary architecture evaluation metric, enabling the generation of numerous and diverse splits. We leverage this new framework in conjunction with various metrics from the zero-shot NAS literature and investigate the benefits of splitting across algorithms and metrics. We find that architectures are distributed in disadvantageous ways inside splits, and that proposed supernet selection methods are flawed.}
}
Endnote
%0 Conference Paper
%T Analyzing Few-Shot Neural Architecture Search in a Metric-Driven Framework
%A Timotée Ly-Manson
%A Mathieu Léonardon
%A Abdeldjalil Aissa El Bey
%A Ghouthi Boukli Hacene
%A Lukas Mauch
%B Proceedings of the Third International Conference on Automated Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Katharina Eggensperger
%E Roman Garnett
%E Joaquin Vanschoren
%E Marius Lindauer
%E Jacob R. Gardner
%F pmlr-v256-ly-manson24a
%I PMLR
%P 5/1--33
%U https://proceedings.mlr.press/v256/ly-manson24a.html
%V 256
%X While Neural Architecture Search (NAS) methods help find optimal neural network architectures for diverse tasks, they often come with unreasonable costs. To tackle such a drawback, the one-shot NAS setting was introduced, where a supernet is used as a superposition of all architectures in the space and performs the search in a single training phase. While this method significantly reduces the cost of running NAS, the joint optimization of every architecture degrades the performance of the search. The few-shot NAS line of work tackles this issue by splitting the supernet into sub-supernets trained separately, each with a reduced level of weight-sharing, which gives rise to the new challenge of finding the best way to split the supernet. In particular, GM-NAS utilizes a gradient matching score to group operations in a splitting schema. We extend and generalize this method by building a framework with compatibility for any arbitrary architecture evaluation metric, enabling the generation of numerous and diverse splits. We leverage this new framework in conjunction with various metrics from the zero-shot NAS literature and investigate the benefits of splitting across algorithms and metrics. We find that architectures are distributed in disadvantageous ways inside splits, and that proposed supernet selection methods are flawed.
APA
Ly-Manson, T., Léonardon, M., El Bey, A. A., Hacene, G. B., & Mauch, L. (2024). Analyzing Few-Shot Neural Architecture Search in a Metric-Driven Framework. Proceedings of the Third International Conference on Automated Machine Learning, in Proceedings of Machine Learning Research 256:5/1-33. Available from https://proceedings.mlr.press/v256/ly-manson24a.html.