Understanding the wiring evolution in differentiable neural architecture search

Sirui Xie, Shoukang Hu, Xinjiang Wang, Chunxiao Liu, Jianping Shi, Xunying Liu, Dahua Lin
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:874-882, 2021.

Abstract

Controversy exists on whether differentiable neural architecture search methods discover wiring topology effectively. To understand how wiring topology evolves, we study the underlying mechanism of several existing differentiable NAS frameworks. Our investigation is motivated by three observed searching patterns of differentiable NAS: 1) they search by growing instead of pruning; 2) wider networks are preferred over deeper ones; 3) no edges are selected in bi-level optimization. To anatomize these phenomena, we propose a unified view on the searching algorithms of existing frameworks, transferring the global optimization to local cost minimization. Based on this reformulation, we conduct empirical and theoretical analyses, revealing implicit biases in the cost’s assignment mechanism and evolution dynamics that cause the observed phenomena. These biases indicate strong discrimination towards certain topologies. In light of these findings, we pose questions that future differentiable methods for neural wiring discovery need to confront, hoping to evoke a discussion and rethinking of how much bias has been enforced implicitly in existing NAS methods.

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-xie21a,
  title     = {Understanding the wiring evolution in differentiable neural architecture search},
  author    = {Xie, Sirui and Hu, Shoukang and Wang, Xinjiang and Liu, Chunxiao and Shi, Jianping and Liu, Xunying and Lin, Dahua},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {874--882},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/xie21a/xie21a.pdf},
  url       = {https://proceedings.mlr.press/v130/xie21a.html},
  abstract  = {Controversy exists on whether differentiable neural architecture search methods discover wiring topology effectively. To understand how wiring topology evolves, we study the underlying mechanism of several existing differentiable NAS frameworks. Our investigation is motivated by three observed searching patterns of differentiable NAS: 1) they search by growing instead of pruning; 2) wider networks are more preferred than deeper ones; 3) no edges are selected in bi-level optimization. To anatomize these phenomena, we propose a unified view on searching algorithms of existing frameworks, transferring the global optimization to local cost minimization. Based on this reformulation, we conduct empirical and theoretical analyses, revealing implicit biases in the cost’s assignment mechanism and evolution dynamics that cause the observed phenomena. These biases indicate strong discrimination towards certain topologies. To this end, we pose questions that future differentiable methods for neural wiring discovery need to confront, hoping to evoke a discussion and rethinking on how much bias has been enforced implicitly in existing NAS methods.}
}
Endnote
%0 Conference Paper
%T Understanding the wiring evolution in differentiable neural architecture search
%A Sirui Xie
%A Shoukang Hu
%A Xinjiang Wang
%A Chunxiao Liu
%A Jianping Shi
%A Xunying Liu
%A Dahua Lin
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-xie21a
%I PMLR
%P 874--882
%U https://proceedings.mlr.press/v130/xie21a.html
%V 130
%X Controversy exists on whether differentiable neural architecture search methods discover wiring topology effectively. To understand how wiring topology evolves, we study the underlying mechanism of several existing differentiable NAS frameworks. Our investigation is motivated by three observed searching patterns of differentiable NAS: 1) they search by growing instead of pruning; 2) wider networks are more preferred than deeper ones; 3) no edges are selected in bi-level optimization. To anatomize these phenomena, we propose a unified view on searching algorithms of existing frameworks, transferring the global optimization to local cost minimization. Based on this reformulation, we conduct empirical and theoretical analyses, revealing implicit biases in the cost’s assignment mechanism and evolution dynamics that cause the observed phenomena. These biases indicate strong discrimination towards certain topologies. To this end, we pose questions that future differentiable methods for neural wiring discovery need to confront, hoping to evoke a discussion and rethinking on how much bias has been enforced implicitly in existing NAS methods.
APA
Xie, S., Hu, S., Wang, X., Liu, C., Shi, J., Liu, X. & Lin, D. (2021). Understanding the wiring evolution in differentiable neural architecture search. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:874-882. Available from https://proceedings.mlr.press/v130/xie21a.html.