Deep-n-Cheap: An Automated Search Framework for Low Complexity Deep Learning

Sourya Dey, Saikrishna C. Kanala, Keith M. Chugg, Peter A. Beerel
Proceedings of The 12th Asian Conference on Machine Learning, PMLR 129:273-288, 2020.

Abstract

We present Deep-n-Cheap – an open-source AutoML framework to search for deep learning models. This search includes both architecture and training hyperparameters, and supports convolutional neural networks and multilayer perceptrons. Our framework is targeted for deployment on both benchmark and custom datasets, and as a result, offers a greater degree of search space customizability as compared to a more limited search over only pre-existing models from literature. We also introduce the technique of 'search transfer', which demonstrates the generalization capabilities of our models to multiple datasets. Deep-n-Cheap includes a user-customizable complexity penalty which trades off performance with training time or number of parameters. Specifically, our framework results in models offering performance comparable to state-of-the-art while taking 1-2 orders of magnitude less time to train than models from other AutoML and model search frameworks. Additionally, this work investigates and develops various insights regarding the search process. In particular, we show the superiority of a greedy strategy and justify our choice of Bayesian optimization as the primary search methodology over random/grid search.
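The user-customizable complexity penalty mentioned above can be illustrated with a small sketch. The function names, the log-scaled penalty form, the reference scale `c0`, and the toy candidate configurations below are all hypothetical assumptions for illustration, not the actual Deep-n-Cheap API or objective:

```python
import math

def penalized_objective(val_loss, complexity, w_c, c0=1.0):
    """Combine validation loss with a complexity penalty.

    w_c = 0 searches purely for performance; larger w_c favors
    cheaper models (e.g. fewer parameters or less training time).
    The log form and the c0 reference scale are illustrative
    assumptions, not the framework's exact formulation.
    """
    return math.log(val_loss) + w_c * math.log(complexity / c0)

# Toy search space: hypothetical candidate configs, each with a
# pre-computed validation loss and parameter count.
candidates = [
    {"name": "small",  "val_loss": 0.60, "params": 1e5},
    {"name": "medium", "val_loss": 0.45, "params": 1e6},
    {"name": "large",  "val_loss": 0.42, "params": 2e7},
]

def best_config(w_c):
    # Pick the candidate minimizing the penalized objective.
    return min(candidates,
               key=lambda c: penalized_objective(
                   c["val_loss"], c["params"], w_c, c0=1e5))

print(best_config(0.0)["name"])  # no penalty: best-accuracy model wins
print(best_config(0.1)["name"])  # penalty shifts choice to a cheaper model
```

With `w_c = 0` the search reduces to pure accuracy optimization and the largest model wins; with a nonzero penalty the trade-off shifts toward cheaper models, which is the knob the abstract describes for trading performance against training cost.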

Cite this Paper


BibTeX
@InProceedings{pmlr-v129-dey20a,
  title     = {Deep-n-Cheap: An Automated Search Framework for Low Complexity Deep Learning},
  author    = {Dey, Sourya and Kanala, Saikrishna C. and Chugg, Keith M. and Beerel, Peter A.},
  booktitle = {Proceedings of The 12th Asian Conference on Machine Learning},
  pages     = {273--288},
  year      = {2020},
  editor    = {Pan, Sinno Jialin and Sugiyama, Masashi},
  volume    = {129},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--20 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v129/dey20a/dey20a.pdf},
  url       = {https://proceedings.mlr.press/v129/dey20a.html},
  abstract  = {We present Deep-n-Cheap – an open-source AutoML framework to search for deep learning models. This search includes both architecture and training hyperparameters, and supports convolutional neural networks and multilayer perceptrons. Our framework is targeted for deployment on both benchmark and custom datasets, and as a result, offers a greater degree of search space customizability as compared to a more limited search over only pre-existing models from literature. We also introduce the technique of 'search transfer', which demonstrates the generalization capabilities of our models to multiple datasets. Deep-n-Cheap includes a user-customizable complexity penalty which trades off performance with training time or number of parameters. Specifically, our framework results in models offering performance comparable to state-of-the-art while taking 1-2 orders of magnitude less time to train than models from other AutoML and model search frameworks. Additionally, this work investigates and develops various insights regarding the search process. In particular, we show the superiority of a greedy strategy and justify our choice of Bayesian optimization as the primary search methodology over random/grid search.}
}
Endnote
%0 Conference Paper
%T Deep-n-Cheap: An Automated Search Framework for Low Complexity Deep Learning
%A Sourya Dey
%A Saikrishna C. Kanala
%A Keith M. Chugg
%A Peter A. Beerel
%B Proceedings of The 12th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Sinno Jialin Pan
%E Masashi Sugiyama
%F pmlr-v129-dey20a
%I PMLR
%P 273--288
%U https://proceedings.mlr.press/v129/dey20a.html
%V 129
%X We present Deep-n-Cheap – an open-source AutoML framework to search for deep learning models. This search includes both architecture and training hyperparameters, and supports convolutional neural networks and multilayer perceptrons. Our framework is targeted for deployment on both benchmark and custom datasets, and as a result, offers a greater degree of search space customizability as compared to a more limited search over only pre-existing models from literature. We also introduce the technique of 'search transfer', which demonstrates the generalization capabilities of our models to multiple datasets. Deep-n-Cheap includes a user-customizable complexity penalty which trades off performance with training time or number of parameters. Specifically, our framework results in models offering performance comparable to state-of-the-art while taking 1-2 orders of magnitude less time to train than models from other AutoML and model search frameworks. Additionally, this work investigates and develops various insights regarding the search process. In particular, we show the superiority of a greedy strategy and justify our choice of Bayesian optimization as the primary search methodology over random/grid search.
APA
Dey, S., Kanala, S.C., Chugg, K.M. & Beerel, P.A. (2020). Deep-n-Cheap: An Automated Search Framework for Low Complexity Deep Learning. Proceedings of The 12th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 129:273-288. Available from https://proceedings.mlr.press/v129/dey20a.html.