Finite Smoothing Algorithm for High-Dimensional Support Vector Machines and Quantile Regression

Qian Tang, Yikai Zhang, Boxiang Wang
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:47865-47884, 2024.

Abstract

This paper introduces a finite smoothing algorithm (FSA), a novel approach to tackle computational challenges in applying support vector machines (SVM) and quantile regression to high-dimensional data. The critical issue with these methods is the non-smooth nature of their loss functions, which traditionally limits the use of highly efficient coordinate descent techniques in high-dimensional settings. FSA innovatively addresses this issue by transforming these loss functions into their smooth counterparts, thereby facilitating more efficient computation. A distinctive feature of FSA is its theoretical foundation: FSA can yield exact solutions, not just approximations, despite the smoothing approach. Our simulation and benchmark tests demonstrate that FSA significantly outpaces its competitors in speed, often by orders of magnitude, while improving or at least maintaining precision. We have implemented FSA in two open-source R packages: hdsvm for high-dimensional SVM and hdqr for high-dimensional quantile regression.
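
For concreteness, the non-smooth losses the abstract refers to are the hinge loss of the SVM and the check (pinball) loss of quantile regression. The sketch below writes out both, together with a standard Huber-type smoothing of the hinge as an illustration of the kind of smooth surrogate that enables coordinate descent; the paper's own smoothing construction may differ in its details.

$$
H(u) = \max(1 - u,\, 0), \qquad \text{evaluated at } u_i = y_i(\beta_0 + x_i^\top \beta) \quad \text{(hinge loss, SVM)}
$$
$$
\rho_\tau(r) = r\bigl(\tau - \mathbf{1}\{r < 0\}\bigr) = \max\bigl(\tau r,\ (\tau - 1)r\bigr) \quad \text{(check loss at quantile level } \tau\text{)}
$$
$$
H_\delta(u) =
\begin{cases}
1 - u - \delta/2, & u \le 1 - \delta,\\
(1 - u)^2 / (2\delta), & 1 - \delta < u \le 1,\\
0, & u > 1,
\end{cases}
\qquad \delta > 0 \quad \text{(illustrative Huber-type smoothed hinge)}
$$

As $\delta \to 0$, $H_\delta$ recovers the hinge loss; the abstract's claim is that FSA's smoothed problems can return the exact solutions of the original non-smooth problems rather than approximations.

A minimal usage sketch of the two R packages named in the abstract follows, assuming a glmnet-style (x, y) fitting interface; the function names hdsvm() and hdqr() and the arguments lambda and tau are assumptions here and should be checked against the package documentation.

# Assumed interface; consult the hdsvm and hdqr documentation for the actual arguments.
install.packages(c("hdsvm", "hdqr"))
library(hdsvm)
library(hdqr)

set.seed(1)
n <- 100; p <- 500                          # high-dimensional setting: p >> n
x <- matrix(rnorm(n * p), n, p)
y_class <- sign(x[, 1] + 0.5 * rnorm(n))    # +1/-1 labels for the SVM
y_reg   <- x[, 1] + rnorm(n)                # continuous response for quantile regression

fit_svm <- hdsvm(x, y_class, lambda = 0.1)          # penalized SVM fit via FSA (assumed call)
fit_qr  <- hdqr(x, y_reg, tau = 0.5, lambda = 0.1)  # penalized median regression via FSA (assumed call)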

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-tang24j,
  title     = {Finite Smoothing Algorithm for High-Dimensional Support Vector Machines and Quantile Regression},
  author    = {Tang, Qian and Zhang, Yikai and Wang, Boxiang},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {47865--47884},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/tang24j/tang24j.pdf},
  url       = {https://proceedings.mlr.press/v235/tang24j.html},
  abstract  = {This paper introduces a finite smoothing algorithm (FSA), a novel approach to tackle computational challenges in applying support vector machines (SVM) and quantile regression to high-dimensional data. The critical issue with these methods is the non-smooth nature of their loss functions, which traditionally limits the use of highly efficient coordinate descent techniques in high-dimensional settings. FSA innovatively addresses this issue by transforming these loss functions into their smooth counterparts, thereby facilitating more efficient computation. A distinctive feature of FSA is its theoretical foundation: FSA can yield exact solutions, not just approximations, despite the smoothing approach. Our simulation and benchmark tests demonstrate that FSA significantly outpaces its competitors in speed, often by orders of magnitude, while improving or at least maintaining precision. We have implemented FSA in two open-source R packages: hdsvm for high-dimensional SVM and hdqr for high-dimensional quantile regression.}
}
Endnote
%0 Conference Paper
%T Finite Smoothing Algorithm for High-Dimensional Support Vector Machines and Quantile Regression
%A Qian Tang
%A Yikai Zhang
%A Boxiang Wang
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-tang24j
%I PMLR
%P 47865--47884
%U https://proceedings.mlr.press/v235/tang24j.html
%V 235
%X This paper introduces a finite smoothing algorithm (FSA), a novel approach to tackle computational challenges in applying support vector machines (SVM) and quantile regression to high-dimensional data. The critical issue with these methods is the non-smooth nature of their loss functions, which traditionally limits the use of highly efficient coordinate descent techniques in high-dimensional settings. FSA innovatively addresses this issue by transforming these loss functions into their smooth counterparts, thereby facilitating more efficient computation. A distinctive feature of FSA is its theoretical foundation: FSA can yield exact solutions, not just approximations, despite the smoothing approach. Our simulation and benchmark tests demonstrate that FSA significantly outpaces its competitors in speed, often by orders of magnitude, while improving or at least maintaining precision. We have implemented FSA in two open-source R packages: hdsvm for high-dimensional SVM and hdqr for high-dimensional quantile regression.
APA
Tang, Q., Zhang, Y. & Wang, B. (2024). Finite Smoothing Algorithm for High-Dimensional Support Vector Machines and Quantile Regression. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:47865-47884. Available from https://proceedings.mlr.press/v235/tang24j.html.
