Faster Performance Estimation for NAS with Embedding Proximity Score

Gideon Franken, Prabhant Singh, Joaquin Vanschoren
ECMLPKDD Workshop on Meta-Knowledge Transfer, PMLR 191:51-61, 2022.

Abstract

Neural Architecture Search (NAS) methods generate large numbers of candidate architectures that must be trained to assess their performance and find an optimal architecture. To minimize search time, different performance estimation strategies are used. The effectiveness of such strategies varies in terms of accuracy as well as fit and query time. We propose the Embedding Proximity Score (EmProx). EmProx builds a meta-model that maps candidate architectures to a continuous embedding space using an encoder-decoder framework. The performance of candidates is then estimated using weighted kNN based on the embedding vectors of architectures whose performance is known. Performance estimations of this method are comparable to those of similar predictors in terms of accuracy, while being nearly nine times faster to train. Benchmarking against other performance estimation strategies currently in use shows similar or better accuracy, while being five to eighty times faster. Code is publicly available on GitHub.
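
The estimation step described in the abstract can be illustrated with a minimal sketch: given embedding vectors produced by an encoder-decoder meta-model, a distance-weighted kNN regressor estimates a candidate's performance from architectures whose performance is already known. The encoder, variable names, and synthetic data below are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the EmProx idea (assumptions only, not the paper's code):
# estimate a candidate architecture's accuracy with distance-weighted kNN
# over learned embedding vectors.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Stand-ins for embeddings produced by the encoder-decoder meta-model:
# each row is the embedding vector of one architecture.
train_embeddings = rng.normal(size=(500, 32))      # architectures with known accuracy
train_accuracy = rng.uniform(0.85, 0.95, size=500) # their measured accuracies
candidate_embeddings = rng.normal(size=(20, 32))   # unevaluated candidates

# Distance-weighted kNN: closer neighbors in embedding space contribute
# more to the estimated performance of a candidate.
knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
knn.fit(train_embeddings, train_accuracy)

estimated_accuracy = knn.predict(candidate_embeddings)
print(estimated_accuracy[:5])

Because fitting a kNN model amounts to storing the embedding vectors, the fit and query cost stays low, which is consistent with the speed-up over trained performance predictors reported in the abstract.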

Cite this Paper


BibTeX
@InProceedings{pmlr-v191-franken22a,
  title = {Faster Performance Estimation for NAS with Embedding Proximity Score},
  author = {Franken, Gideon and Singh, Prabhant and Vanschoren, Joaquin},
  booktitle = {ECMLPKDD Workshop on Meta-Knowledge Transfer},
  pages = {51--61},
  year = {2022},
  editor = {Brazdil, Pavel and van Rijn, Jan N. and Gouk, Henry and Mohr, Felix},
  volume = {191},
  series = {Proceedings of Machine Learning Research},
  month = {23 Sep},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v191/franken22a/franken22a.pdf},
  url = {https://proceedings.mlr.press/v191/franken22a.html},
  abstract = {Neural Architecture Search (NAS) methods generate large numbers of candidate architectures that must be trained to assess their performance and find an optimal architecture. To minimize search time, different performance estimation strategies are used. The effectiveness of such strategies varies in terms of accuracy as well as fit and query time. We propose the Embedding Proximity Score (EmProx). EmProx builds a meta-model that maps candidate architectures to a continuous embedding space using an encoder-decoder framework. The performance of candidates is then estimated using weighted kNN based on the embedding vectors of architectures whose performance is known. Performance estimations of this method are comparable to those of similar predictors in terms of accuracy, while being nearly nine times faster to train. Benchmarking against other performance estimation strategies currently in use shows similar or better accuracy, while being five to eighty times faster. Code is publicly available on GitHub.}
}
Endnote
%0 Conference Paper
%T Faster Performance Estimation for NAS with Embedding Proximity Score
%A Gideon Franken
%A Prabhant Singh
%A Joaquin Vanschoren
%B ECMLPKDD Workshop on Meta-Knowledge Transfer
%C Proceedings of Machine Learning Research
%D 2022
%E Pavel Brazdil
%E Jan N. van Rijn
%E Henry Gouk
%E Felix Mohr
%F pmlr-v191-franken22a
%I PMLR
%P 51--61
%U https://proceedings.mlr.press/v191/franken22a.html
%V 191
%X Neural Architecture Search (NAS) methods generate large numbers of candidate architectures that must be trained to assess their performance and find an optimal architecture. To minimize search time, different performance estimation strategies are used. The effectiveness of such strategies varies in terms of accuracy as well as fit and query time. We propose the Embedding Proximity Score (EmProx). EmProx builds a meta-model that maps candidate architectures to a continuous embedding space using an encoder-decoder framework. The performance of candidates is then estimated using weighted kNN based on the embedding vectors of architectures whose performance is known. Performance estimations of this method are comparable to those of similar predictors in terms of accuracy, while being nearly nine times faster to train. Benchmarking against other performance estimation strategies currently in use shows similar or better accuracy, while being five to eighty times faster. Code is publicly available on GitHub.
APA
Franken, G., Singh, P. & Vanschoren, J. (2022). Faster Performance Estimation for NAS with Embedding Proximity Score. ECMLPKDD Workshop on Meta-Knowledge Transfer, in Proceedings of Machine Learning Research 191:51-61. Available from https://proceedings.mlr.press/v191/franken22a.html.