Deep Principal Support Vector Machines for Nonlinear Sufficient Dimension Reduction

Yinfeng Chen, Jin Liu, Rui Qiu
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:8574-8593, 2025.

Abstract

The normal vectors obtained from the support vector machine (SVM) method offer the potential to achieve sufficient dimension reduction in both classification and regression scenarios. Motivated by this, we introduce a unified framework for nonlinear sufficient dimension reduction based on a classification ensemble. Kernel principal SVM, which leverages the reproducing kernel Hilbert space, can almost be regarded as a special case of this framework, and we generalize it by using a neural network function class for more flexible deep nonlinear reduction. We theoretically prove its unbiasedness with respect to the central $\sigma$-field and provide a nonasymptotic upper bound for the estimation error. Simulations and real data analysis demonstrate the considerable competitiveness of the proposed method, especially under heavy data contamination, large sample sizes, and complex inputs.
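To illustrate the idea the framework builds on (not the paper's deep method itself), the classical linear principal SVM slices the response into binary classification problems, fits a linear SVM to each slice, and takes the principal directions of the collected normal vectors as the estimated reduction subspace. Below is a minimal sketch on synthetic data; the toy model, slice quantiles, and all parameter choices are illustrative assumptions.

```python
# Hedged sketch of *linear* principal SVM for sufficient dimension
# reduction, on a synthetic model where the response depends on X
# only through its first coordinate (true 1-d central subspace).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0] = 1.0                               # true direction: e1 (assumed toy model)
y = np.tanh(2 * X @ beta) + 0.1 * rng.standard_normal(n)

# Slice the response at several quantiles; each slice yields a binary
# classification problem whose SVM normal vector points (in population)
# into the central subspace.
normals = []
for q in np.quantile(y, [0.25, 0.5, 0.75]):
    labels = (y > q).astype(int)
    svc = LinearSVC(C=1.0, max_iter=5000).fit(X, labels)
    normals.append(svc.coef_.ravel())

# The leading right singular vector of the stacked normals estimates
# the reduction direction.
M = np.array(normals)
_, _, vt = np.linalg.svd(M)
direction = vt[0]
print(np.abs(direction))                    # weight concentrates on coordinate 0
```

The paper's contribution replaces the linear (or kernel) function class in this construction with a neural network class, which is what enables the deep nonlinear reduction and the robustness to contamination reported in the abstract.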

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-chen25ap,
  title     = {Deep Principal Support Vector Machines for Nonlinear Sufficient Dimension Reduction},
  author    = {Chen, Yinfeng and Liu, Jin and Qiu, Rui},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {8574--8593},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/chen25ap/chen25ap.pdf},
  url       = {https://proceedings.mlr.press/v267/chen25ap.html},
  abstract  = {The normal vectors obtained from the support vector machine (SVM) method offer the potential to achieve sufficient dimension reduction in both classification and regression scenarios. Motivated by it, we in this paper introduce a unified framework for nonlinear sufficient dimension reduction based on classification ensemble. Kernel principal SVM, which leverages the reproducing kernel Hilbert space, can almost be regarded as a special case of this framework, and we generalize it by using a neural network function class for more flexible deep nonlinear reduction. We theoretically prove its unbiasedness with respect to the central $\sigma$-field and provide a nonasymptotic upper bound for the estimation error. Simulations and real data analysis demonstrate the considerable competitiveness of the proposed method, especially under heavy data contamination, large sample sizes, and complex inputs.}
}
Endnote
%0 Conference Paper
%T Deep Principal Support Vector Machines for Nonlinear Sufficient Dimension Reduction
%A Yinfeng Chen
%A Jin Liu
%A Rui Qiu
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-chen25ap
%I PMLR
%P 8574--8593
%U https://proceedings.mlr.press/v267/chen25ap.html
%V 267
%X The normal vectors obtained from the support vector machine (SVM) method offer the potential to achieve sufficient dimension reduction in both classification and regression scenarios. Motivated by it, we in this paper introduce a unified framework for nonlinear sufficient dimension reduction based on classification ensemble. Kernel principal SVM, which leverages the reproducing kernel Hilbert space, can almost be regarded as a special case of this framework, and we generalize it by using a neural network function class for more flexible deep nonlinear reduction. We theoretically prove its unbiasedness with respect to the central $\sigma$-field and provide a nonasymptotic upper bound for the estimation error. Simulations and real data analysis demonstrate the considerable competitiveness of the proposed method, especially under heavy data contamination, large sample sizes, and complex inputs.
APA
Chen, Y., Liu, J. & Qiu, R. (2025). Deep Principal Support Vector Machines for Nonlinear Sufficient Dimension Reduction. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:8574-8593. Available from https://proceedings.mlr.press/v267/chen25ap.html.
