Optimal $k$-Discretization Learning

Tong Wang, Zhangyang Wang
Conference on Parsimony and Learning, PMLR 328:552-564, 2026.

Abstract

The $k$-discretization problem is known to be NP-hard in general. Existing algorithms exploit various heuristics and reach only local minima that are sensitive to initialization. This paper starts by discussing how to leverage polynomial-time optimal solvers for 1-D $k$-discretization, which can serve as a powerful and parsimonious regularizer for complex learning tasks. The algorithm can be accelerated by sampling, with proven bounds on the approximation error. The paper then presents an embedding learning approach that handles multi-dimensional $k$-discretization by building on the 1-D solution. Equipped with several novel task-specific modifications, the proposed approach achieves highly promising performance on a wide variety of application tasks, including signal quantization, image clustering, and image smoothing. Our code is available at \url{https://github.com/VITA-Group/SnC}.
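The abstract refers to polynomial-time optimal solvers for 1-D $k$-discretization. A minimal sketch of one such solver is the standard $O(kn^2)$ dynamic program that minimizes the within-cluster sum of squared errors over sorted 1-D data. This is a generic illustration only, not the authors' implementation (see the linked repository for that); the function name and interface here are assumptions.

```python
def k_discretize_1d(xs, k):
    """Optimally split 1-D data into k contiguous clusters (sorted order),
    minimizing the total within-cluster sum of squared errors (SSE).
    Returns (optimal SSE, sorted start indices of each cluster)."""
    xs = sorted(xs)
    n = len(xs)
    # Prefix sums of values and of squares for O(1) segment-cost queries.
    ps = [0.0] * (n + 1)
    ps2 = [0.0] * (n + 1)
    for i, x in enumerate(xs):
        ps[i + 1] = ps[i] + x
        ps2[i + 1] = ps2[i] + x * x

    def sse(i, j):
        # SSE of a single cluster covering xs[i..j] (inclusive).
        s = ps[j + 1] - ps[i]
        s2 = ps2[j + 1] - ps2[i]
        m = j - i + 1
        return s2 - s * s / m

    INF = float("inf")
    # D[m][j]: minimal cost of splitting xs[0..j] into m clusters.
    D = [[INF] * n for _ in range(k + 1)]
    back = [[0] * n for _ in range(k + 1)]
    for j in range(n):
        D[1][j] = sse(0, j)
    for m in range(2, k + 1):
        for j in range(m - 1, n):
            for i in range(m - 1, j + 1):
                c = D[m - 1][i - 1] + sse(i, j)
                if c < D[m][j]:
                    D[m][j] = c
                    back[m][j] = i
    # Backtrack to recover the start index of each cluster.
    starts, j = [], n - 1
    for m in range(k, 0, -1):
        i = back[m][j] if m > 1 else 0
        starts.append(i)
        j = i - 1
    return D[k][n - 1], sorted(starts)
```

Because clusters of sorted 1-D data are always contiguous intervals, this DP is globally optimal, unlike heuristic Lloyd-style iterations; the paper's sampling acceleration addresses the quadratic cost in $n$.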

Cite this Paper


BibTeX
@InProceedings{pmlr-v328-wang26b,
  title     = {Optimal $k$-Discretization Learning},
  author    = {Wang, Tong and Wang, Zhangyang},
  booktitle = {Conference on Parsimony and Learning},
  pages     = {552--564},
  year      = {2026},
  editor    = {Burkholz, Rebekka and Liu, Shiwei and Ravishankar, Saiprasad and Redman, William and Huang, Wei and Su, Weijie and Zhu, Zhihui},
  volume    = {328},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Mar},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v328/main/assets/wang26b/wang26b.pdf},
  url       = {https://proceedings.mlr.press/v328/wang26b.html},
  abstract  = {The $k$-discretization problem is known to be NP-hard in general. Existing algorithms exploit various heuristics and obtain at most local minima that are sensitive to initializations. This paper starts by discussing how to leverage polynomial-time optimal solvers for 1-D $k$-discretization which can serve as a powerful and parsimonious regularizer for complex learning tasks. The algorithm can be accelerated by sampling, with bounded approximation errors proven. The paper then presents an embedding learning approach to handle multi-dimensional $k$-discretization, based on the 1-D solution. Equipped with many novel task-specific modifications, the proposed approach achieves highly promising performance on a vast variety of application tasks, including signal quantization, image clustering, and image smoothening. Our codes are available at \url{https://github.com/VITA-Group/SnC}.}
}
Endnote
%0 Conference Paper
%T Optimal $k$-Discretization Learning
%A Tong Wang
%A Zhangyang Wang
%B Conference on Parsimony and Learning
%C Proceedings of Machine Learning Research
%D 2026
%E Rebekka Burkholz
%E Shiwei Liu
%E Saiprasad Ravishankar
%E William Redman
%E Wei Huang
%E Weijie Su
%E Zhihui Zhu
%F pmlr-v328-wang26b
%I PMLR
%P 552--564
%U https://proceedings.mlr.press/v328/wang26b.html
%V 328
%X The $k$-discretization problem is known to be NP-hard in general. Existing algorithms exploit various heuristics and obtain at most local minima that are sensitive to initializations. This paper starts by discussing how to leverage polynomial-time optimal solvers for 1-D $k$-discretization which can serve as a powerful and parsimonious regularizer for complex learning tasks. The algorithm can be accelerated by sampling, with bounded approximation errors proven. The paper then presents an embedding learning approach to handle multi-dimensional $k$-discretization, based on the 1-D solution. Equipped with many novel task-specific modifications, the proposed approach achieves highly promising performance on a vast variety of application tasks, including signal quantization, image clustering, and image smoothening. Our codes are available at \url{https://github.com/VITA-Group/SnC}.
APA
Wang, T. &amp; Wang, Z. (2026). Optimal $k$-Discretization Learning. Conference on Parsimony and Learning, in Proceedings of Machine Learning Research 328:552-564. Available from https://proceedings.mlr.press/v328/wang26b.html.