Bayesian Sparsification of Deep C-valued Networks

Ivan Nazarov, Evgeny Burnaev
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:7230-7242, 2020.

Abstract

With continual miniaturization, ever more applications of deep learning can be found in embedded systems, where it is common to encounter data with a natural representation in the complex domain. To this end, we extend Sparse Variational Dropout to complex-valued neural networks and verify the proposed Bayesian technique by conducting a large numerical study of the performance-compression trade-off of C-valued networks on two tasks: image recognition on MNIST-like and CIFAR10 datasets, and music transcription on MusicNet. We replicate the state-of-the-art result of Trabelsi et al. (2018) on MusicNet with a complex-valued network compressed by 50-100x at a small performance penalty.
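The core idea summarized above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the layer shapes, the log-uniform dropout-rate parameterization `log_alpha`, and the pruning threshold `tau` are all illustrative assumptions, loosely following Sparse Variational Dropout (Molchanov et al., 2017) with the Gaussian noise replaced by circular complex Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy layer: complex weights theta with a per-weight
# dropout-rate parameter log_alpha (illustrative values only).
theta = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))
log_alpha = rng.uniform(-6.0, 4.0, size=theta.shape)

def forward(x, theta, log_alpha, rng):
    """Stochastic forward pass w = theta * (1 + sqrt(alpha) * eps),
    with eps drawn from the standard circular complex Gaussian CN(0, 1)."""
    alpha = np.exp(log_alpha)
    eps = (rng.standard_normal(theta.shape)
           + 1j * rng.standard_normal(theta.shape)) / np.sqrt(2.0)
    w = theta * (1.0 + np.sqrt(alpha) * eps)
    return x @ w

def sparsify(theta, log_alpha, tau=3.0):
    """Prune weights whose dropout rate exceeds the threshold tau:
    a large log_alpha means the weight is dominated by noise."""
    mask = log_alpha < tau
    return theta * mask, mask

x = rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))
y = forward(x, theta, log_alpha, rng)            # noisy training-time output
theta_sparse, mask = sparsify(theta, log_alpha)  # compressed weights
print("fraction of weights kept:", mask.mean())
```

In the paper, `log_alpha` is learned jointly with the weights by maximizing the evidence lower bound; here it is randomized purely to demonstrate the pruning mechanics behind the reported 50-100x compression.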

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-nazarov20a,
  title     = {{B}ayesian Sparsification of Deep C-valued Networks},
  author    = {Nazarov, Ivan and Burnaev, Evgeny},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {7230--7242},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/nazarov20a/nazarov20a.pdf},
  url       = {https://proceedings.mlr.press/v119/nazarov20a.html},
  abstract  = {With continual miniaturization ever more applications of deep learning can be found in embedded systems, where it is common to encounter data with natural representation in the complex domain. To this end we extend Sparse Variational Dropout to complex-valued neural networks and verify the proposed Bayesian technique by conducting a large numerical study of the performance-compression trade-off of C-valued networks on two tasks: image recognition on MNIST-like and CIFAR10 datasets and music transcription on MusicNet. We replicate the state-of-the-art result by Trabelsi et al. (2018) on MusicNet with a complex-valued network compressed by 50-100x at a small performance penalty.}
}
Endnote
%0 Conference Paper
%T Bayesian Sparsification of Deep C-valued Networks
%A Ivan Nazarov
%A Evgeny Burnaev
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-nazarov20a
%I PMLR
%P 7230--7242
%U https://proceedings.mlr.press/v119/nazarov20a.html
%V 119
%X With continual miniaturization ever more applications of deep learning can be found in embedded systems, where it is common to encounter data with natural representation in the complex domain. To this end we extend Sparse Variational Dropout to complex-valued neural networks and verify the proposed Bayesian technique by conducting a large numerical study of the performance-compression trade-off of C-valued networks on two tasks: image recognition on MNIST-like and CIFAR10 datasets and music transcription on MusicNet. We replicate the state-of-the-art result by Trabelsi et al. (2018) on MusicNet with a complex-valued network compressed by 50-100x at a small performance penalty.
APA
Nazarov, I. & Burnaev, E. (2020). Bayesian Sparsification of Deep C-valued Networks. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:7230-7242. Available from https://proceedings.mlr.press/v119/nazarov20a.html.