HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search

Niv Nayman, Yonathan Aflalo, Asaf Noy, Lihi Zelnik
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:7979-7990, 2021.

Abstract

Realistic use of neural networks often requires adhering to multiple constraints on latency, energy, and memory, among others. A popular approach to finding fitting networks is constrained Neural Architecture Search (NAS); however, previous methods enforce the constraint only softly. As a result, the produced networks do not exactly adhere to the resource constraint, and their accuracy is harmed. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), which is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.
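To make the mechanism in the abstract concrete, below is a minimal, illustrative sketch (not the authors' code): the expected latency of a sampled architecture is linear in the distribution p over candidate operations, so a Frank-Wolfe-style step, whose linear-minimization oracle is a small LP over the latency-constrained simplex, keeps every iterate feasible. All names (lat, T, lmo) and the single-block setup are hypothetical simplifications; the paper searches over a full one-shot supernet with a block-coordinate variant of this update.

    # Illustrative sketch only: hard-constrained differentiable search via
    # Frank-Wolfe over the latency-constrained simplex. The gradient here is
    # random noise standing in for dLoss/dp from a trained one-shot supernet.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    n_ops = 8                                # candidate ops in one block (hypothetical)
    lat = rng.uniform(1.0, 5.0, n_ops)       # measured per-op latencies (ms)
    T = 1.2 * lat.min()                      # hard latency budget (ms), chosen feasible

    def lmo(grad):
        """argmin_s grad.s  s.t.  s >= 0, sum(s) = 1, lat.s <= T  (a tiny LP)."""
        res = linprog(c=grad,
                      A_ub=lat[None, :], b_ub=[T],           # hard resource constraint
                      A_eq=np.ones((1, n_ops)), b_eq=[1.0],  # simplex constraint
                      bounds=[(0.0, 1.0)] * n_ops)
        return res.x

    p = lmo(np.zeros(n_ops))                 # any feasible starting point
    for t in range(50):
        grad = rng.normal(size=n_ops)        # stand-in for the supernet gradient
        s = lmo(grad)                        # vertex of the feasible polytope
        gamma = 2.0 / (t + 2.0)              # standard Frank-Wolfe step size
        p = (1.0 - gamma) * p + gamma * s    # convex combination => still feasible
        assert lat @ p <= T + 1e-9           # the hard constraint never breaks

    print(f"expected latency {lat @ p:.3f} ms <= budget {T:.3f} ms")

Because every update is a convex combination of feasible points, feasibility holds at every step of the search rather than only after a post-hoc correction, which is the property the abstract contrasts with soft-penalty methods.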

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-nayman21a,
  title     = {HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search},
  author    = {Nayman, Niv and Aflalo, Yonathan and Noy, Asaf and Zelnik, Lihi},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {7979--7990},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/nayman21a/nayman21a.pdf},
  url       = {https://proceedings.mlr.press/v139/nayman21a.html},
  abstract  = {Realistic use of neural networks often requires adhering to multiple constraints on latency, energy and memory among others. A popular approach to find fitting networks is through constrained Neural Architecture Search (NAS), however, previous methods enforce the constraint only softly. Therefore, the resulting networks do not exactly adhere to the resource constraint and their accuracy is harmed. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), that is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.}
}
Endnote
%0 Conference Paper
%T HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search
%A Niv Nayman
%A Yonathan Aflalo
%A Asaf Noy
%A Lihi Zelnik
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-nayman21a
%I PMLR
%P 7979--7990
%U https://proceedings.mlr.press/v139/nayman21a.html
%V 139
%X Realistic use of neural networks often requires adhering to multiple constraints on latency, energy and memory among others. A popular approach to find fitting networks is through constrained Neural Architecture Search (NAS), however, previous methods enforce the constraint only softly. Therefore, the resulting networks do not exactly adhere to the resource constraint and their accuracy is harmed. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), that is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.
APA
Nayman, N., Aflalo, Y., Noy, A. & Zelnik, L. (2021). HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:7979-7990. Available from https://proceedings.mlr.press/v139/nayman21a.html.