Deep Q-Exponential Processes

Zhi Chang, Chukwudi Paul Obite, Shuang Zhou, Shiwei Lan
Proceedings of the 7th Symposium on Advances in Approximate Bayesian Inference, PMLR 289:1-24, 2025.

Abstract

Motivated by deep neural networks, the deep Gaussian process (DGP) generalizes the standard GP by stacking multiple layers of GPs. Despite the enhanced expressiveness, the GP, as an $L_2$ regularization prior, tends to over-smooth and is sub-optimal for inhomogeneous objects such as images with edges. Recently, the Q-exponential process (Q-EP) was proposed as an $L_q$ relaxation of the GP and shown to have more desirable regularization properties, governed by a parameter $q > 0$ with $q = 2$ corresponding to the GP. Sharing the tractable posterior and predictive distributions of the GP, the Q-EP can also be stacked to improve its modeling flexibility. In this paper, we generalize Q-EP to the deep Q-EP to model inhomogeneous data with improved expressiveness. We introduce the shallow Q-EP as a latent variable model and then build a hierarchy of shallow Q-EP layers. Sparse approximation by inducing points and a scalable variational strategy are applied to facilitate inference. We demonstrate the numerical advantages of the proposed deep Q-EP model by comparing it with multiple state-of-the-art deep probabilistic models.
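
To make the $L_q$ relaxation concrete: the finite-dimensional marginals of a Q-EP follow a multivariate q-exponential distribution, which replaces the Gaussian's quadratic exponent with a $q/2$ power of the squared Mahalanobis radius. A minimal sketch of that density (our notation; the paper's exact parameterization may differ):

$$p(u \mid \mu, C) = \frac{q}{2}\,(2\pi)^{-d/2}\,|C|^{-1/2}\, r^{\left(\frac{q}{2}-1\right)\frac{d}{2}} \exp\!\left\{-\tfrac{1}{2}\, r^{q/2}\right\}, \qquad r = (u-\mu)^{\top} C^{-1}(u-\mu).$$

For $q = 2$ the factor $r^{(\frac{q}{2}-1)\frac{d}{2}}$ equals $1$ and the exponent reduces to $-\frac{1}{2}r$, recovering $\mathcal{N}(\mu, C)$ and hence the stated GP correspondence; for $q < 2$ the negative log-density penalizes the $q$-th power of the radius, the $L_q$-type regularization that favors sharper, less smooth features.

The density above admits a polar-style sampler: draw a uniform direction on the unit sphere and a radius $\rho$ with $\rho^q \sim \chi^2_d$, which reduces to the standard Gaussian sampler at $q = 2$. A short sketch in Python/NumPy, consistent with the density above (illustrative only; `sample_qed` is our hypothetical helper, not the authors' code):

import numpy as np

def sample_qed(mu, C, q, rng):
    """Draw one sample from the multivariate q-exponential q-ED_d(mu, C).

    Polar representation consistent with the density sketched above:
    u = mu + rho * L @ S, with S uniform on the unit sphere S^{d-1}
    and rho**q ~ chi-squared(d). At q = 2 this is the usual Gaussian
    sampler, matching the q = 2 <-> GP correspondence.
    """
    d = len(mu)
    L = np.linalg.cholesky(C)            # C = L @ L.T
    S = rng.standard_normal(d)
    S /= np.linalg.norm(S)               # uniform direction on the sphere
    rho = rng.chisquare(d) ** (1.0 / q)  # radius: rho^q ~ chi2 with d dof
    return mu + rho * (L @ S)

# Usage, e.g. one draw from a q-ED prior with covariance C (d x d):
# rng = np.random.default_rng(0)
# u = sample_qed(np.zeros(d), C, q=1.0, rng=rng)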

Cite this Paper


BibTeX
@InProceedings{pmlr-v289-chang25a,
  title     = {Deep Q-Exponential Processes},
  author    = {Chang, Zhi and Obite, Chukwudi Paul and Zhou, Shuang and Lan, Shiwei},
  booktitle = {Proceedings of the 7th Symposium on Advances in Approximate Bayesian Inference},
  pages     = {1--24},
  year      = {2025},
  editor    = {Allingham, James Urquhart and Swaroop, Siddharth},
  volume    = {289},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Apr},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v289/main/assets/chang25a/chang25a.pdf},
  url       = {https://proceedings.mlr.press/v289/chang25a.html}
}
Endnote
%0 Conference Paper
%T Deep Q-Exponential Processes
%A Zhi Chang
%A Chukwudi Paul Obite
%A Shuang Zhou
%A Shiwei Lan
%B Proceedings of the 7th Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2025
%E James Urquhart Allingham
%E Siddharth Swaroop
%F pmlr-v289-chang25a
%I PMLR
%P 1--24
%U https://proceedings.mlr.press/v289/chang25a.html
%V 289
APA
Chang, Z., Obite, C. P., Zhou, S., & Lan, S. (2025). Deep Q-Exponential Processes. Proceedings of the 7th Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 289:1-24. Available from https://proceedings.mlr.press/v289/chang25a.html.