DDSAS: Dynamic and Differentiable Space-Architecture Search

Longxing Yang, Yu Hu, Shun Lu, Zihao Sun, Jilin Mei, Yiming Zeng, Zhiping Shi, Yinhe Han, Xiaowei Li
Proceedings of The 13th Asian Conference on Machine Learning, PMLR 157:284-299, 2021.

Abstract

Neural Architecture Search (NAS) has made remarkable progress in automatically designing neural networks. However, existing differentiable NAS and stochastic NAS methods are either biased towards exploitation, and thus may converge to a local minimum, or biased towards exploration, and thus converge slowly. In this work, we propose a Dynamic and Differentiable Space-Architecture Search (DDSAS) method to address this exploration-exploitation dilemma. DDSAS dynamically samples a search subspace, searches architectures within it by gradient descent, and leverages the Upper Confidence Bound (UCB) to balance exploitation and exploration. The whole search space is elastic, offering the flexibility to evolve and to accommodate resource constraints. Experiments on image classification datasets demonstrate that with only 4 GB of memory and 3 hours of search time, DDSAS achieves 2.39% test error on CIFAR10, 16.26% test error on CIFAR100, and 23.9% test error when transferred to ImageNet. When searching directly on ImageNet, DDSAS achieves comparable accuracy with a more than 6.5x speedup over state-of-the-art methods. The source code is available at https://github.com/xingxing-123/DDSAS.
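To make the sampling idea concrete, below is a minimal, self-contained Python sketch of UCB-driven subspace sampling. It is not the authors' implementation (see the linked repository for that): the cell sizes, the exploration constant, and the `search_step` reward are placeholder assumptions, and a real search would train a supernet by gradient descent inside the sampled subspace rather than drawing random rewards.

import math
import random

# Toy setting (all sizes are assumptions for illustration only).
NUM_EDGES = 4    # edges in a toy cell
NUM_OPS = 7      # candidate operations per edge
SUBSPACE_K = 3   # operations sampled per edge to form the subspace
UCB_C = 1.0      # exploration coefficient

counts = [[1] * NUM_OPS for _ in range(NUM_EDGES)]    # times each op was sampled
values = [[0.0] * NUM_OPS for _ in range(NUM_EDGES)]  # running mean "reward"

def ucb_score(edge, op, t):
    """Mean reward plus an exploration bonus that shrinks with visit count."""
    return values[edge][op] + UCB_C * math.sqrt(math.log(t) / counts[edge][op])

def sample_subspace(t):
    """Keep the top-K UCB operations on every edge to form the subspace."""
    subspace = []
    for e in range(NUM_EDGES):
        ranked = sorted(range(NUM_OPS), key=lambda o: ucb_score(e, o, t), reverse=True)
        subspace.append(ranked[:SUBSPACE_K])
    return subspace

def search_step(subspace):
    """Stand-in for the differentiable search step: returns a fake reward per
    sampled operation; a real implementation would train the supernet
    restricted to `subspace` and read off its softmaxed architecture weights."""
    return {(e, o): random.random() for e in range(NUM_EDGES) for o in subspace[e]}

for t in range(1, 51):                    # search iterations
    rewards = search_step(sample_subspace(t))
    for (e, o), r in rewards.items():     # update running statistics
        counts[e][o] += 1
        values[e][o] += (r - values[e][o]) / counts[e][o]

best = [max(range(NUM_OPS), key=lambda o: values[e][o]) for e in range(NUM_EDGES)]
print("selected op per edge:", best)

Operations that look promising are revisited (exploitation), while rarely sampled operations keep a large bonus and still get tried (exploration); the final architecture keeps the highest-mean operation on each edge.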

Cite this Paper


BibTeX
@InProceedings{pmlr-v157-yang21a,
  title     = {DDSAS: Dynamic and Differentiable Space-Architecture Search},
  author    = {Yang, Longxing and Hu, Yu and Lu, Shun and Sun, Zihao and Mei, Jilin and Zeng, Yiming and Shi, Zhiping and Han, Yinhe and Li, Xiaowei},
  booktitle = {Proceedings of The 13th Asian Conference on Machine Learning},
  pages     = {284--299},
  year      = {2021},
  editor    = {Balasubramanian, Vineeth N. and Tsang, Ivor},
  volume    = {157},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v157/yang21a/yang21a.pdf},
  url       = {https://proceedings.mlr.press/v157/yang21a.html},
  abstract  = {Neural Architecture Search (NAS) has made remarkable progress in automatically designing neural networks. However, existing differentiable NAS and stochastic NAS methods are either biased towards exploitation and thus may converge to a local minimum, or biased towards exploration and thus converge slowly. In this work, we propose a Dynamic and Differentiable Space-Architecture Search (DDSAS) method to address the exploration-exploitation dilemma. DDSAS dynamically samples space, searches architectures in the sampled subspace with gradient descent, and leverages the Upper Confidence Bound (UCB) to balance exploitation and exploration. The whole search space is elastic, offering flexibility to evolve and to consider resource constraints. Experiments on image classification datasets demonstrate that with only 4GB memory and 3 hours for searching, DDSAS achieves 2.39% test error on CIFAR10, 16.26% test error on CIFAR100, and 23.9% test error when transferring to ImageNet. When directly searching on ImageNet, DDSAS achieves comparable accuracy with more than 6.5 times speedup over state-of-the-art methods. The source codes are available at https://github.com/xingxing-123/DDSAS.}
}
Endnote
%0 Conference Paper
%T DDSAS: Dynamic and Differentiable Space-Architecture Search
%A Longxing Yang
%A Yu Hu
%A Shun Lu
%A Zihao Sun
%A Jilin Mei
%A Yiming Zeng
%A Zhiping Shi
%A Yinhe Han
%A Xiaowei Li
%B Proceedings of The 13th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Vineeth N. Balasubramanian
%E Ivor Tsang
%F pmlr-v157-yang21a
%I PMLR
%P 284--299
%U https://proceedings.mlr.press/v157/yang21a.html
%V 157
%X Neural Architecture Search (NAS) has made remarkable progress in automatically designing neural networks. However, existing differentiable NAS and stochastic NAS methods are either biased towards exploitation and thus may converge to a local minimum, or biased towards exploration and thus converge slowly. In this work, we propose a Dynamic and Differentiable Space-Architecture Search (DDSAS) method to address the exploration-exploitation dilemma. DDSAS dynamically samples space, searches architectures in the sampled subspace with gradient descent, and leverages the Upper Confidence Bound (UCB) to balance exploitation and exploration. The whole search space is elastic, offering flexibility to evolve and to consider resource constraints. Experiments on image classification datasets demonstrate that with only 4GB memory and 3 hours for searching, DDSAS achieves 2.39% test error on CIFAR10, 16.26% test error on CIFAR100, and 23.9% test error when transferring to ImageNet. When directly searching on ImageNet, DDSAS achieves comparable accuracy with more than 6.5 times speedup over state-of-the-art methods. The source codes are available at https://github.com/xingxing-123/DDSAS.
APA
Yang, L., Hu, Y., Lu, S., Sun, Z., Mei, J., Zeng, Y., Shi, Z., Han, Y. & Li, X. (2021). DDSAS: Dynamic and Differentiable Space-Architecture Search. Proceedings of The 13th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 157:284-299. Available from https://proceedings.mlr.press/v157/yang21a.html.