Robust $L_p$-Norm Linear Discriminant Analysis with Proxy Matrix Optimization

Navya Nagananda, Breton Minnehan, Andreas Savakis
Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022, PMLR 196:277-286, 2022.

Abstract

Linear Discriminant Analysis (LDA) is an established supervised dimensionality reduction method that is traditionally based on the ${L}_2$-norm. However, the standard ${L}_2$-norm LDA is susceptible to outliers in the data, which often cause a drop in accuracy. Using the ${L}_1$ or fractional $p$-norms makes LDA more robust to outliers, but the resulting optimization problem is harder to solve due to the nature of the corresponding objective functions. In this paper, we leverage the orthogonal constraint of the Grassmann manifold to iteratively obtain the optimal projection matrix for the data in a lower-dimensional space. Instead of optimizing the matrix directly on the manifold, we use the proxy matrix optimization (PMO) method, utilizing an auxiliary matrix in the ambient space that is retracted to the closest location on the manifold along the loss-minimizing geodesic. The ${L}_p$-LDA-PMO learning is based on backpropagation, which allows easy integration into a neural network and flexibility to change the value of the $p$-norm. Our experiments on synthetic and real data show that using fractional $p$-norms for LDA leads to an improvement in accuracy compared to the traditional ${L}_2$-based LDA.
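To make the PMO idea concrete, below is a minimal PyTorch sketch of the approach described in the abstract: an unconstrained proxy matrix is optimized by backpropagation, and at each step it is retracted to the nearest orthonormal matrix (a Stiefel representative of a point on the Grassmann manifold) before evaluating an $L_p$ analogue of the LDA criterion. The paper does not publish code; the function names (retract, lp_lda_loss, fit_lp_lda), the SVD-based retraction, the within-over-between ratio objective, and all hyperparameters are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of Lp-norm LDA via proxy matrix optimization (PMO).
    # All names and design choices here are illustrative, not the paper's code.
    import torch

    def retract(proxy):
        # Retract the unconstrained proxy matrix to the closest orthonormal
        # matrix (in the Frobenius sense) via SVD; gradients flow through this.
        U, _, Vh = torch.linalg.svd(proxy, full_matrices=False)
        return U @ Vh

    def lp_lda_loss(W, X, y, p=1.5, eps=1e-8):
        # Lp analogue of the LDA criterion: within-class scatter divided by
        # between-class scatter, both measured with the elementwise p-norm.
        mu = X.mean(dim=0)
        within, between = 0.0, 0.0
        for c in torch.unique(y):
            Xc = X[y == c]
            mu_c = Xc.mean(dim=0)
            within = within + ((Xc - mu_c) @ W).abs().pow(p).sum()
            between = between + len(Xc) * ((mu_c - mu) @ W).abs().pow(p).sum()
        return within / (between + eps)

    def fit_lp_lda(X, y, n_components=2, p=1.5, steps=500, lr=1e-2):
        # Optimize the proxy in ambient space; the projection is its retraction.
        d = X.shape[1]
        proxy = torch.randn(d, n_components, requires_grad=True)
        opt = torch.optim.Adam([proxy], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            W = retract(proxy)  # backprop passes through the retraction
            lp_lda_loss(W, X, y, p).backward()
            opt.step()
        return retract(proxy).detach()

Usage would be W = fit_lp_lda(X, y, n_components=2, p=1.5), then project with X @ W. Because the loss is an ordinary differentiable PyTorch expression, changing p or dropping the layer into a larger network requires no change to the optimization machinery, which is the flexibility the abstract highlights.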

Cite this Paper


BibTeX
@InProceedings{pmlr-v196-nagananda22a,
  title     = {Robust ${L}_p$-Norm Linear Discriminant Analysis with Proxy Matrix Optimization},
  author    = {Nagananda, Navya and Minnehan, Breton and Savakis, Andreas},
  booktitle = {Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022},
  pages     = {277--286},
  year      = {2022},
  editor    = {Cloninger, Alexander and Doster, Timothy and Emerson, Tegan and Kaul, Manohar and Ktena, Ira and Kvinge, Henry and Miolane, Nina and Rieck, Bastian and Tymochko, Sarah and Wolf, Guy},
  volume    = {196},
  series    = {Proceedings of Machine Learning Research},
  month     = {25 Feb--22 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v196/nagananda22a/nagananda22a.pdf},
  url       = {https://proceedings.mlr.press/v196/nagananda22a.html},
  abstract  = {Linear Discriminant Analysis (LDA) is an established supervised dimensionality reduction method that is traditionally based on the ${L}_2$-norm. However, the standard ${L}_2$-norm LDA is susceptible to outliers in the data that often contribute to a drop in accuracy. Using the ${L}_1$ or fractional $p$-norms makes LDA more robust to outliers, but it is a harder problem to solve due to the nature of the corresponding objective functions. In this paper, we leverage the orthogonal constraint of the Grassmann manifold to iteratively obtain the optimal projection matrix for the data in a lower dimensional space. Instead of optimizing the matrix directly on the manifold, we use the proxy matrix optimization (PMO) method, utilizing an auxiliary matrix in ambient space that is retracted to the closest location on the manifold along the loss minimizing geodesic. The ${L}_p$-LDA-PMO learning is based on backpropagation, which allows easy integration in a neural network and flexibility to change the value of the $p$-norm. Our experiments on synthetic and real data show that using fractional $p$-norms for LDA leads to an improvement in accuracy compared to the traditional ${L}_2$-based LDA.}
}
Endnote
%0 Conference Paper
%T Robust $L_p$-Norm Linear Discriminant Analysis with Proxy Matrix Optimization
%A Navya Nagananda
%A Breton Minnehan
%A Andreas Savakis
%B Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022
%C Proceedings of Machine Learning Research
%D 2022
%E Alexander Cloninger
%E Timothy Doster
%E Tegan Emerson
%E Manohar Kaul
%E Ira Ktena
%E Henry Kvinge
%E Nina Miolane
%E Bastian Rieck
%E Sarah Tymochko
%E Guy Wolf
%F pmlr-v196-nagananda22a
%I PMLR
%P 277--286
%U https://proceedings.mlr.press/v196/nagananda22a.html
%V 196
%X Linear Discriminant Analysis (LDA) is an established supervised dimensionality reduction method that is traditionally based on the ${L}_2$-norm. However, the standard ${L}_2$-norm LDA is susceptible to outliers in the data that often contribute to a drop in accuracy. Using the ${L}_1$ or fractional $p$-norms makes LDA more robust to outliers, but it is a harder problem to solve due to the nature of the corresponding objective functions. In this paper, we leverage the orthogonal constraint of the Grassmann manifold to iteratively obtain the optimal projection matrix for the data in a lower dimensional space. Instead of optimizing the matrix directly on the manifold, we use the proxy matrix optimization (PMO) method, utilizing an auxiliary matrix in ambient space that is retracted to the closest location on the manifold along the loss minimizing geodesic. The ${L}_p$-LDA-PMO learning is based on backpropagation, which allows easy integration in a neural network and flexibility to change the value of the $p$-norm. Our experiments on synthetic and real data show that using fractional $p$-norms for LDA leads to an improvement in accuracy compared to the traditional ${L}_2$-based LDA.
APA
Nagananda, N., Minnehan, B. & Savakis, A. (2022). Robust $L_p$-Norm Linear Discriminant Analysis with Proxy Matrix Optimization. Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022, in Proceedings of Machine Learning Research 196:277-286. Available from https://proceedings.mlr.press/v196/nagananda22a.html.