StrassenNets: Deep Learning with a Multiplication Budget
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4985-4994, 2018.
Abstract
A large fraction of the arithmetic operations required to evaluate deep neural networks (DNNs) consists of matrix multiplications, in both convolution and fully connected layers. We perform end-to-end learning of low-cost approximations of matrix multiplications in DNN layers by casting matrix multiplications as 2-layer sum-product networks (SPNs) (arithmetic circuits) and learning their (ternary) edge weights from data. The SPNs disentangle multiplication and addition operations and enable us to impose a budget on the number of multiplication operations. Combining our method with knowledge distillation and applying it to image classification DNNs (trained on ImageNet) and language modeling DNNs (using LSTMs), we obtain a first-of-a-kind reduction in the number of multiplications (over 99.5%) while maintaining the predictive performance of the full-precision models. Finally, we demonstrate that the proposed framework is able to rediscover Strassen’s matrix multiplication algorithm, learning to multiply $2 \times 2$ matrices using only 7 multiplications instead of 8.
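To make the 2-layer SPN formulation concrete, the sketch below hand-encodes Strassen's classic algorithm in exactly the form the abstract describes: vec(C) = Wc ((Wa vec(A)) * (Wb vec(B))), where Wa, Wb, Wc are ternary (-1/0/+1) matrices and * is the elementwise product. The matrix names and row-major vectorization are my own conventions for illustration; in the paper these ternary weights are learned from data, with the width of the hidden product layer (here 7) acting as the multiplication budget.

```python
import numpy as np

# Ternary weights encoding Strassen's algorithm as a 2-layer sum-product
# network: vec(C) = Wc @ ((Wa @ vec(A)) * (Wb @ vec(B))).
# Rows of Wa and Wb form the 7 operand sums; Wc recombines the 7 products.
Wa = np.array([[ 1, 0, 0,  1],   # a11 + a22
               [ 0, 0, 1,  1],   # a21 + a22
               [ 1, 0, 0,  0],   # a11
               [ 0, 0, 0,  1],   # a22
               [ 1, 1, 0,  0],   # a11 + a12
               [-1, 0, 1,  0],   # a21 - a11
               [ 0, 1, 0, -1]])  # a12 - a22
Wb = np.array([[ 1, 0, 0,  1],   # b11 + b22
               [ 1, 0, 0,  0],   # b11
               [ 0, 1, 0, -1],   # b12 - b22
               [-1, 0, 1,  0],   # b21 - b11
               [ 0, 0, 0,  1],   # b22
               [ 1, 1, 0,  0],   # b11 + b12
               [ 0, 0, 1,  1]])  # b21 + b22
Wc = np.array([[1,  0, 0, 1, -1, 0, 1],   # c11
               [0,  0, 1, 0,  1, 0, 0],   # c12
               [0,  1, 0, 1,  0, 0, 0],   # c21
               [1, -1, 1, 0,  0, 1, 0]])  # c22

def spn_matmul(A, B):
    """Multiply two 2x2 matrices with 7 (not 8) scalar multiplications."""
    # The elementwise product holds the 7 genuine multiplications; applying
    # Wa, Wb, Wc needs only signed additions since their entries are ternary.
    prods = (Wa @ A.reshape(4)) * (Wb @ B.reshape(4))
    return (Wc @ prods).reshape(2, 2)

A, B = np.random.randn(2, 2), np.random.randn(2, 2)
assert np.allclose(spn_matmul(A, B), A @ B)
```

Shrinking the hidden layer below 7 (or below n^3 for general n x n products) is what trades exactness for a lower multiplication budget, which the paper then compensates for by training the ternary weights end-to-end.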