A Stein identity for $q$-Gaussians with bounded support

Sophia Sklaviadis, Thomas Möllenhoff, Mario A. T. Figueiredo, Andre Martins, Mohammad Emtiyaz Khan
Conference on Parsimony and Learning, PMLR 328:921-939, 2026.

Abstract

Stein’s identity is a fundamental tool in machine learning with applications in generative models, stochastic optimization, and other problems involving gradients of expectations under Gaussian distributions. Less attention has been paid to problems with non-Gaussian expectations. Here, we consider the class of bounded-support $q$-Gaussians and derive a new Stein identity leading to gradient estimators which have nearly identical forms to the Gaussian ones, and which are similarly easy to implement. We do this by extending the previous results of Landsman, Vanduffel, and Yao (2013) to prove new Bonnet- and Price-type theorems for $q$-Gaussians. We also simplify their forms by using *escort* distributions. Our experiments show that bounded-support distributions can reduce the variance of gradient estimators, which can potentially be useful for Bayesian deep learning and sharpness-aware minimization. Overall, our work simplifies the application of Stein’s identity for an important class of non-Gaussian distributions.
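For context, the classical Gaussian result the paper extends is Bonnet's theorem, $\nabla_\mu \mathbb{E}_{z\sim\mathcal{N}(\mu,\sigma^2)}[f(z)] = \mathbb{E}[f'(z)]$, which turns a gradient of an expectation into an expectation of a gradient. The sketch below (not code from the paper; the test function $f=\sin$ is an arbitrary choice) checks this identity by Monte Carlo against a finite-difference estimate.

```python
import numpy as np

# Minimal sketch of Bonnet's theorem for a 1-D Gaussian:
#   d/dmu E_{z ~ N(mu, sigma^2)}[f(z)] = E[f'(z)].
# We verify it numerically with f = sin, using common random numbers
# (the reparameterization z = mu + sigma * x) for both estimators.
rng = np.random.default_rng(0)
mu, sigma, n = 0.5, 1.0, 1_000_000
f, fprime = np.sin, np.cos          # test function and its derivative

x = rng.standard_normal(n)          # shared standard-normal noise
bonnet = fprime(mu + sigma * x).mean()   # E[f'(z)]: the Bonnet estimator

eps = 1e-4                          # central finite difference of mu -> E[f(z)]
fd = (f(mu + eps + sigma * x) - f(mu - eps + sigma * x)).mean() / (2 * eps)

# Closed form for this f: d/dmu E[sin(z)] = exp(-sigma^2 / 2) * cos(mu)
exact = np.exp(-sigma**2 / 2) * np.cos(mu)
print(bonnet, fd, exact)            # all three agree to Monte Carlo accuracy
```

The paper's contribution is an analogue of this identity (and of Price's theorem for covariance gradients) when the Gaussian is replaced by a bounded-support $q$-Gaussian, with the resulting estimator simplified via escort distributions.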

Cite this Paper


BibTeX
@InProceedings{pmlr-v328-sklaviadis26a,
  title     = {A Stein identity for $q$-Gaussians with bounded support},
  author    = {Sklaviadis, Sophia and M\"{o}llenhoff, Thomas and Figueiredo, Mario A. T. and Martins, Andre and Khan, Mohammad Emtiyaz},
  booktitle = {Conference on Parsimony and Learning},
  pages     = {921--939},
  year      = {2026},
  editor    = {Burkholz, Rebekka and Liu, Shiwei and Ravishankar, Saiprasad and Redman, William and Huang, Wei and Su, Weijie and Zhu, Zhihui},
  volume    = {328},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Mar},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v328/main/assets/sklaviadis26a/sklaviadis26a.pdf},
  url       = {https://proceedings.mlr.press/v328/sklaviadis26a.html},
  abstract  = {Stein's identity is a fundamental tool in machine learning with applications in generative models, stochastic optimization, and other problems involving gradients of expectations under Gaussian distributions. Less attention has been paid to problems with non-Gaussian expectations. Here, we consider the class of bounded-support $q$-Gaussians and derive a new Stein identity leading to gradient estimators which have nearly identical forms to the Gaussian ones, and which are similarly easy to implement. We do this by extending the previous results of Landsman, Vanduffel, and Yao (2013) to prove new Bonnet- and Price-type theorems for $q$-Gaussians. We also simplify their forms by using \emph{escort} distributions. Our experiments show that bounded-support distributions can reduce the variance of gradient estimators, which can potentially be useful for Bayesian deep learning and sharpness-aware minimization. Overall, our work simplifies the application of Stein's identity for an important class of non-Gaussian distributions.}
}
Endnote
%0 Conference Paper
%T A Stein identity for $q$-Gaussians with bounded support
%A Sophia Sklaviadis
%A Thomas Möllenhoff
%A Mario A. T. Figueiredo
%A Andre Martins
%A Mohammad Emtiyaz Khan
%B Conference on Parsimony and Learning
%C Proceedings of Machine Learning Research
%D 2026
%E Rebekka Burkholz
%E Shiwei Liu
%E Saiprasad Ravishankar
%E William Redman
%E Wei Huang
%E Weijie Su
%E Zhihui Zhu
%F pmlr-v328-sklaviadis26a
%I PMLR
%P 921--939
%U https://proceedings.mlr.press/v328/sklaviadis26a.html
%V 328
%X Stein’s identity is a fundamental tool in machine learning with applications in generative models, stochastic optimization, and other problems involving gradients of expectations under Gaussian distributions. Less attention has been paid to problems with non-Gaussian expectations. Here, we consider the class of bounded-support $q$-Gaussians and derive a new Stein identity leading to gradient estimators which have nearly identical forms to the Gaussian ones, and which are similarly easy to implement. We do this by extending the previous results of Landsman, Vanduffel, and Yao (2013) to prove new Bonnet- and Price-type theorems for $q$-Gaussians. We also simplify their forms by using *escort* distributions. Our experiments show that bounded-support distributions can reduce the variance of gradient estimators, which can potentially be useful for Bayesian deep learning and sharpness-aware minimization. Overall, our work simplifies the application of Stein’s identity for an important class of non-Gaussian distributions.
APA
Sklaviadis, S., Möllenhoff, T., Figueiredo, M.A.T., Martins, A. & Khan, M.E. (2026). A Stein identity for $q$-Gaussians with bounded support. Conference on Parsimony and Learning, in Proceedings of Machine Learning Research 328:921-939. Available from https://proceedings.mlr.press/v328/sklaviadis26a.html.