Understanding the Representation and Computation of Multilayer Perceptrons: A Case Study in Speech Recognition

Tasha Nagamine, Nima Mesgarani
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2564-2573, 2017.

Abstract

Despite the recent success of deep learning, the nature of the transformations that deep neural networks apply to their input features remains poorly understood. This study provides an empirical framework for examining the encoding properties of node activations in various layers of the network, and for constructing the exact function applied to each data point in the form of a linear transform. These methods are used to discern and quantify properties of feed-forward neural networks trained to map acoustic features to phoneme labels. We show a selective and nonlinear warping of the feature space, achieved by forming prototypical functions to account for the possible variation of each class. This study provides a joint framework in which the properties of node activations and the functions implemented by the network can be linked together.
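
The construction mentioned in the abstract, namely recovering the exact function the network applies to a single data point as a linear transform, follows from the piecewise-linear nature of ReLU-style activations: within the activation region containing an input, the whole network reduces to one affine map. Below is a minimal NumPy sketch of this idea, not the paper's released code; it assumes ReLU hidden units and a linear (pre-softmax) output layer, and all names and dimensions (pointwise_affine_map, Ws, bs, x, 40-64-10) are illustrative.

import numpy as np

def pointwise_affine_map(Ws, bs, x):
    """Return (A, c) such that the network output at x equals A @ x + c.

    Ws[i] and bs[i] are the weight matrix and bias of layer i. Because
    ReLU is piecewise linear, the network is an affine function within
    the activation region containing x; masking each weight matrix by
    its active units composes that affine function exactly.
    """
    A = np.eye(len(x))
    c = np.zeros(len(x))
    h = x
    for i, (W, b) in enumerate(zip(Ws, bs)):
        pre = W @ h + b
        if i < len(Ws) - 1:                 # hidden layer: apply the ReLU mask
            mask = (pre > 0).astype(pre.dtype)
            A = (mask[:, None] * W) @ A     # zero out rows of inactive units
            c = mask * (W @ c + b)
            h = mask * pre
        else:                               # output layer: linear (pre-softmax)
            A = W @ A
            c = W @ c + b
    return A, c

# Sanity check on a random 40-64-10 network (dimensions are illustrative):
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((64, 40)), rng.standard_normal((10, 64))]
bs = [rng.standard_normal(64), rng.standard_normal(10)]
x = rng.standard_normal(40)
A, c = pointwise_affine_map(Ws, bs, x)
y = Ws[1] @ np.maximum(Ws[0] @ x + bs[0], 0) + bs[1]
assert np.allclose(A @ x + c, y)

Under this assumption, comparing the per-point matrices A across inputs of the same phoneme class is one concrete way to quantify the class-dependent warping of the feature space that the paper describes.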

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-nagamine17a,
  title     = {Understanding the Representation and Computation of Multilayer Perceptrons: A Case Study in Speech Recognition},
  author    = {Tasha Nagamine and Nima Mesgarani},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2564--2573},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/nagamine17a/nagamine17a.pdf},
  url       = {https://proceedings.mlr.press/v70/nagamine17a.html},
  abstract  = {Despite the recent success of deep learning, the nature of the transformations that deep neural networks apply to their input features remains poorly understood. This study provides an empirical framework for examining the encoding properties of node activations in various layers of the network, and for constructing the exact function applied to each data point in the form of a linear transform. These methods are used to discern and quantify properties of feed-forward neural networks trained to map acoustic features to phoneme labels. We show a selective and nonlinear warping of the feature space, achieved by forming prototypical functions to account for the possible variation of each class. This study provides a joint framework in which the properties of node activations and the functions implemented by the network can be linked together.}
}
EndNote
%0 Conference Paper
%T Understanding the Representation and Computation of Multilayer Perceptrons: A Case Study in Speech Recognition
%A Tasha Nagamine
%A Nima Mesgarani
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-nagamine17a
%I PMLR
%P 2564--2573
%U https://proceedings.mlr.press/v70/nagamine17a.html
%V 70
%X Despite the recent success of deep learning, the nature of the transformations that deep neural networks apply to their input features remains poorly understood. This study provides an empirical framework for examining the encoding properties of node activations in various layers of the network, and for constructing the exact function applied to each data point in the form of a linear transform. These methods are used to discern and quantify properties of feed-forward neural networks trained to map acoustic features to phoneme labels. We show a selective and nonlinear warping of the feature space, achieved by forming prototypical functions to account for the possible variation of each class. This study provides a joint framework in which the properties of node activations and the functions implemented by the network can be linked together.
APA
Nagamine, T. & Mesgarani, N. (2017). Understanding the Representation and Computation of Multilayer Perceptrons: A Case Study in Speech Recognition. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:2564-2573. Available from https://proceedings.mlr.press/v70/nagamine17a.html.
