ConvexVST: A Convex Optimization Approach to Variance-stabilizing Transformation

Mengfan Wang, Boyu Lyu, Guoqiang Yu
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:10839-10848, 2021.

Abstract

The variance-stabilizing transformation (VST) problem is to transform heteroscedastic data to homoscedastic data so that they are more tractable for subsequent analysis. However, most of the existing approaches focus on finding an analytical solution for a certain parametric distribution, which severely limits the applications, because simple distributions cannot faithfully describe the real data while more complicated distributions cannot be analytically solved. In this paper, we converted the VST problem into a convex optimization problem, which can always be efficiently solved, identified the specific structure of the convex problem, which further improved the efficiency of the proposed algorithm, and showed that any finite discrete distributions and the discretized version of any continuous distributions from real data can be variance-stabilized in an easy and nonparametric way. We demonstrated the new approach on bioimaging data and achieved superior performance compared to peer algorithms in terms of not only the variance homoscedasticity but also the impact on subsequent analysis such as denoising. Source codes are available at https://github.com/yu-lab-vt/ConvexVST.
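As background for the abstract's contrast (this is not the paper's convex formulation): the classical Anscombe transform is the kind of analytical, distribution-specific VST that existing approaches provide. For Poisson data, whose raw variance equals the mean (heteroscedastic), applying `2*sqrt(x + 3/8)` yields approximately unit variance across intensity levels. A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def anscombe(x):
    """Classical Anscombe VST for Poisson counts: 2 * sqrt(x + 3/8)."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

# Raw Poisson samples are heteroscedastic: Var[X] = E[X] = m.
# After the transform, the variance is approximately 1 regardless of m.
for m in (5, 20, 50, 100):
    x = rng.poisson(m, size=200_000)
    print(f"mean={m:3d}  raw var={x.var():7.2f}  transformed var={anscombe(x).var():.3f}")
```

This works only because a closed-form solution exists for the Poisson family; the paper's point is that such analytical solutions are unavailable for the more complicated distributions seen in real data, which motivates solving the VST problem numerically as a convex program instead.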

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-wang21p,
  title     = {ConvexVST: A Convex Optimization Approach to Variance-stabilizing Transformation},
  author    = {Wang, Mengfan and Lyu, Boyu and Yu, Guoqiang},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {10839--10848},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/wang21p/wang21p.pdf},
  url       = {https://proceedings.mlr.press/v139/wang21p.html},
  abstract  = {The variance-stabilizing transformation (VST) problem is to transform heteroscedastic data to homoscedastic data so that they are more tractable for subsequent analysis. However, most of the existing approaches focus on finding an analytical solution for a certain parametric distribution, which severely limits the applications, because simple distributions cannot faithfully describe the real data while more complicated distributions cannot be analytically solved. In this paper, we converted the VST problem into a convex optimization problem, which can always be efficiently solved, identified the specific structure of the convex problem, which further improved the efficiency of the proposed algorithm, and showed that any finite discrete distributions and the discretized version of any continuous distributions from real data can be variance-stabilized in an easy and nonparametric way. We demonstrated the new approach on bioimaging data and achieved superior performance compared to peer algorithms in terms of not only the variance homoscedasticity but also the impact on subsequent analysis such as denoising. Source codes are available at https://github.com/yu-lab-vt/ConvexVST.}
}
Endnote
%0 Conference Paper
%T ConvexVST: A Convex Optimization Approach to Variance-stabilizing Transformation
%A Mengfan Wang
%A Boyu Lyu
%A Guoqiang Yu
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-wang21p
%I PMLR
%P 10839--10848
%U https://proceedings.mlr.press/v139/wang21p.html
%V 139
%X The variance-stabilizing transformation (VST) problem is to transform heteroscedastic data to homoscedastic data so that they are more tractable for subsequent analysis. However, most of the existing approaches focus on finding an analytical solution for a certain parametric distribution, which severely limits the applications, because simple distributions cannot faithfully describe the real data while more complicated distributions cannot be analytically solved. In this paper, we converted the VST problem into a convex optimization problem, which can always be efficiently solved, identified the specific structure of the convex problem, which further improved the efficiency of the proposed algorithm, and showed that any finite discrete distributions and the discretized version of any continuous distributions from real data can be variance-stabilized in an easy and nonparametric way. We demonstrated the new approach on bioimaging data and achieved superior performance compared to peer algorithms in terms of not only the variance homoscedasticity but also the impact on subsequent analysis such as denoising. Source codes are available at https://github.com/yu-lab-vt/ConvexVST.
APA
Wang, M., Lyu, B. &amp; Yu, G. (2021). ConvexVST: A Convex Optimization Approach to Variance-stabilizing Transformation. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:10839-10848. Available from https://proceedings.mlr.press/v139/wang21p.html.