Analytical Guarantees on Numerical Precision of Deep Neural Networks

Charbel Sakr, Yongjune Kim, Naresh Shanbhag
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3007-3016, 2017.

Abstract

The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on numerical precision – a key parameter defining the complexity of neural networks. First, we present theoretical bounds on the accuracy in the presence of limited precision. Interestingly, these bounds can be computed via the back-propagation algorithm. Hence, by combining our theoretical analysis and the back-propagation algorithm, we are able to readily determine the minimum precision needed to preserve accuracy without having to resort to time-consuming fixed-point simulations. We provide numerical evidence showing how our approach allows us to maintain high accuracy but with lower complexity than state-of-the-art binary networks.
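To make the abstract's idea concrete, the sketch below illustrates the kind of computation it describes: using back-propagated gradients to estimate how much B-bit quantization noise in weights and activations perturbs a network's output. This is a minimal first-order noise-gain calculation on a toy PyTorch model, with the top-2 logit margin standing in for the paper's soft outputs; it is not the paper's exact mismatch-probability bound, and the model, bit-widths, and dynamic range below are illustrative assumptions.

```python
# Hedged sketch: first-order estimate of how B-bit fixed-point quantization noise
# in weights and activations perturbs a network's output margin, using gradients
# from back-propagation. Illustrative only; NOT the paper's exact bound.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy network and batch (assumed for illustration).
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 10))
x = torch.randn(8, 16)

# Forward pass, keeping intermediate activations so we can differentiate w.r.t. them.
a1 = model[0](x)
h1 = model[1](a1)
out = model[2](h1)

# Use the margin between the top-2 logits per sample as a proxy "soft output".
top2 = out.topk(2, dim=1).values
margin = (top2[:, 0] - top2[:, 1]).sum()

# Back-propagation gives sensitivities w.r.t. weights and activations in one pass.
grads = torch.autograd.grad(margin, [model[0].weight, model[2].weight, a1, h1])
gW = torch.cat([g.flatten() for g in grads[:2]])   # weight sensitivities
gA = torch.cat([g.flatten() for g in grads[2:]])   # activation sensitivities

def step(bits, dynamic_range=1.0):
    """Quantization step of a signed fixed-point format spanning [-range, range)."""
    return 2.0 * dynamic_range / (2 ** bits)

for B in (4, 6, 8, 10):
    d = step(B)
    # Uniform quantization noise has variance d^2 / 12; propagate it to the margin
    # to first order via the squared gradient norms (standard noise-gain analysis).
    var = (d ** 2 / 12) * (gW.pow(2).sum() + gA.pow(2).sum())
    print(f"B={B:2d} bits  estimated margin perturbation std ≈ {var.sqrt().item():.4f}")
```

Sweeping B in this way mirrors how one would pick the smallest precision whose estimated perturbation stays acceptably small, without running a full fixed-point simulation for every candidate bit-width.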

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-sakr17a,
  title     = {Analytical Guarantees on Numerical Precision of Deep Neural Networks},
  author    = {Charbel Sakr and Yongjune Kim and Naresh Shanbhag},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3007--3016},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/sakr17a/sakr17a.pdf},
  url       = {https://proceedings.mlr.press/v70/sakr17a.html},
  abstract  = {The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on numerical precision – a key parameter defining the complexity of neural networks. First, we present theoretical bounds on the accuracy in the presence of limited precision. Interestingly, these bounds can be computed via the back-propagation algorithm. Hence, by combining our theoretical analysis and the back-propagation algorithm, we are able to readily determine the minimum precision needed to preserve accuracy without having to resort to time-consuming fixed-point simulations. We provide numerical evidence showing how our approach allows us to maintain high accuracy but with lower complexity than state-of-the-art binary networks.}
}
Endnote
%0 Conference Paper
%T Analytical Guarantees on Numerical Precision of Deep Neural Networks
%A Charbel Sakr
%A Yongjune Kim
%A Naresh Shanbhag
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-sakr17a
%I PMLR
%P 3007--3016
%U https://proceedings.mlr.press/v70/sakr17a.html
%V 70
%X The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on numerical precision – a key parameter defining the complexity of neural networks. First, we present theoretical bounds on the accuracy in the presence of limited precision. Interestingly, these bounds can be computed via the back-propagation algorithm. Hence, by combining our theoretical analysis and the back-propagation algorithm, we are able to readily determine the minimum precision needed to preserve accuracy without having to resort to time-consuming fixed-point simulations. We provide numerical evidence showing how our approach allows us to maintain high accuracy but with lower complexity than state-of-the-art binary networks.
APA
Sakr, C., Kim, Y. & Shanbhag, N. (2017). Analytical Guarantees on Numerical Precision of Deep Neural Networks. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3007-3016. Available from https://proceedings.mlr.press/v70/sakr17a.html.