Density-preserving quantization with application to graph downsampling

Morteza Alamgir, Gábor Lugosi, Ulrike Luxburg
Proceedings of The 27th Conference on Learning Theory, PMLR 35:543-559, 2014.

Abstract

We consider the problem of vector quantization of i.i.d. samples drawn from a density p on $\mathbb{R}^d$. It is desirable that the representatives selected by the quantization algorithm have the same distribution p as the original sample points. However, quantization algorithms based on Euclidean distance, such as k-means, do not have this property. We provide a solution to this problem that takes the unweighted k-nearest neighbor graph on the sample as input. In particular, it does not need to have access to the data points themselves. Our solution generates quantization centers that are “evenly spaced”. We exploit this property to downsample geometric graphs and show that our method produces sparse downsampled graphs. Our algorithm is easy to implement, and we provide theoretical guarantees on the performance of the proposed algorithm.
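
Illustrative sketch (not the paper's algorithm). To make the “evenly spaced” idea concrete: one natural way to pick centers from an unweighted kNN graph, using only the graph and not the coordinates, is to greedily select vertices that are more than r hops away from every previously selected center. Because hop distances in a kNN graph are shorter where the sample is dense, centers packed at a fixed hop radius end up closer together (in Euclidean terms) exactly where the density p is high. The function name, the adjacency-list input format, and the radius parameter r below are illustrative assumptions, not details taken from the paper.

from collections import deque

def hop_packing_centers(adj, r):
    """Greedily pick centers on an unweighted graph (adjacency lists given as
    {vertex: iterable of neighbors}) so that any two chosen centers are more
    than r hops apart and every vertex lies within r hops of some center.
    Hypothetical illustration of 'evenly spaced in the graph metric' only."""
    centers = []
    covered = set()                  # vertices within r hops of a chosen center
    for v in adj:                    # any fixed scan order works for this sketch
        if v in covered:
            continue
        centers.append(v)
        # BFS out to radius r from the new center, marking vertices as covered
        queue = deque([(v, 0)])
        seen = {v}
        while queue:
            u, d = queue.popleft()
            covered.add(u)
            if d == r:
                continue
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append((w, d + 1))
    return centers

# Tiny usage example on the path graph 0-1-2-3-4-5 with r = 1:
# neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
# hop_packing_centers(neighbors, 1)  ->  [0, 2, 4]

The paper's theoretical guarantees apply to its own algorithm and parameter choices; this sketch only shows what packing centers at a fixed graph radius looks like.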

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-alamgir14,
  title     = {Density-preserving quantization with application to graph downsampling},
  author    = {Morteza Alamgir and Gábor Lugosi and Ulrike Luxburg},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {543--559},
  year      = {2014},
  editor    = {Maria Florina Balcan and Vitaly Feldman and Csaba Szepesvári},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/alamgir14.pdf},
  url       = {http://proceedings.mlr.press/v35/alamgir14.html},
  abstract  = {We consider the problem of vector quantization of i.i.d. samples drawn from a density p on $\mathbb{R}^d$. It is desirable that the representatives selected by the quantization algorithm have the same distribution p as the original sample points. However, quantization algorithms based on Euclidean distance, such as k-means, do not have this property. We provide a solution to this problem that takes the unweighted k-nearest neighbor graph on the sample as input. In particular, it does not need to have access to the data points themselves. Our solution generates quantization centers that are “evenly spaced”. We exploit this property to downsample geometric graphs and show that our method produces sparse downsampled graphs. Our algorithm is easy to implement, and we provide theoretical guarantees on the performance of the proposed algorithm.}
}
APA
Alamgir, M., Lugosi, G., & Luxburg, U. (2014). Density-preserving quantization with application to graph downsampling. Proceedings of The 27th Conference on Learning Theory, in PMLR 35:543-559.

Related Material

Download PDF: http://proceedings.mlr.press/v35/alamgir14.pdf