On cyclical MCMC sampling

Liwei Wang, Xinru Liu, Aaron Smith, Aguemon Y Atchade
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3817-3825, 2024.

Abstract

Cyclical MCMC is a novel MCMC framework recently proposed by Zhang et al. (2019) to address the challenge posed by high-dimensional multimodal posterior distributions like those arising in deep learning. The algorithm works by generating a nonhomogeneous Markov chain that tracks, cyclically in time, tempered versions of the target distribution. We show in this work that cyclical MCMC converges to the desired probability distribution in settings where the Markov kernels used are fast mixing and sufficiently long cycles are employed. However, in the far more common setting of slow-mixing kernels, the algorithm may fail to produce samples from the desired distribution. In particular, in a simple mixture example with unequal variances we show by simulation that cyclical MCMC fails to converge to the desired limit. Finally, we show that cyclical MCMC typically estimates well the local shape of the target distribution around each mode, even when we do not have convergence to the target.
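The cyclical schedule from Zhang et al. (2019) can be illustrated with a minimal sketch: a cosine step-size schedule drives unadjusted Langevin steps, so large steps early in each cycle explore (a tempered, flattened view of the target) and small steps late in each cycle sample locally around the mode found. The two-component Gaussian mixture below, with unequal variances, is an illustrative assumption in the spirit of the paper's counterexample, not its exact experiment.

```python
import numpy as np

def cyclical_stepsizes(total_iters, num_cycles, alpha0):
    """Cosine cyclical step-size schedule (Zhang et al., 2019 style):
    each cycle starts at alpha0 and decays to ~0."""
    cycle_len = total_iters // num_cycles
    t = np.arange(total_iters)
    return alpha0 / 2 * (np.cos(np.pi * (t % cycle_len) / cycle_len) + 1)

def grad_log_mixture(x):
    """Score of a two-component Gaussian mixture with unequal variances
    (hypothetical parameters chosen for illustration only)."""
    mus, sigmas, ws = (-3.0, 3.0), (1.0, 0.5), (0.5, 0.5)
    comps = [w * np.exp(-0.5 * ((x - m) / s) ** 2) / s
             for w, m, s in zip(ws, mus, sigmas)]
    score = sum(c * (m - x) / s ** 2 for c, m, s in zip(comps, mus, sigmas))
    return score / sum(comps)

def cyclical_langevin(grad_log_pi, x0, total_iters=3000, num_cycles=5,
                      alpha0=0.5, rng=None):
    """Unadjusted Langevin dynamics with a cyclical step size: this is the
    nonhomogeneous chain structure the abstract describes, in sketch form."""
    rng = np.random.default_rng() if rng is None else rng
    alphas = cyclical_stepsizes(total_iters, num_cycles, alpha0)
    x = np.array(x0, dtype=float)
    samples = []
    for a in alphas:
        # Langevin update: drift along the score plus Gaussian noise.
        x = x + 0.5 * a * grad_log_pi(x) + np.sqrt(a) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)
```

Because each cycle restarts with a large step size, the chain revisits both modes; the paper's point is that the *proportion* of time spent in each mode need not match the target's mixture weights when the kernels mix slowly, even though the local shape around each mode is captured well.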

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-wang24j,
  title = {On cyclical {MCMC} sampling},
  author = {Wang, Liwei and Liu, Xinru and Smith, Aaron and Y Atchade, Aguemon},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages = {3817--3825},
  year = {2024},
  editor = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume = {238},
  series = {Proceedings of Machine Learning Research},
  month = {02--04 May},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v238/wang24j/wang24j.pdf},
  url = {https://proceedings.mlr.press/v238/wang24j.html},
  abstract = {Cyclical MCMC is a novel MCMC framework recently proposed by Zhang et al. (2019) to address the challenge posed by high-dimensional multimodal posterior distributions like those arising in deep learning. The algorithm works by generating a nonhomogeneous Markov chain that tracks, cyclically in time, tempered versions of the target distribution. We show in this work that cyclical MCMC converges to the desired probability distribution in settings where the Markov kernels used are fast mixing and sufficiently long cycles are employed. However, in the far more common setting of slow-mixing kernels, the algorithm may fail to produce samples from the desired distribution. In particular, in a simple mixture example with unequal variances we show by simulation that cyclical MCMC fails to converge to the desired limit. Finally, we show that cyclical MCMC typically estimates well the local shape of the target distribution around each mode, even when we do not have convergence to the target.}
}
Endnote
%0 Conference Paper
%T On cyclical MCMC sampling
%A Liwei Wang
%A Xinru Liu
%A Aaron Smith
%A Aguemon Y Atchade
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-wang24j
%I PMLR
%P 3817--3825
%U https://proceedings.mlr.press/v238/wang24j.html
%V 238
%X Cyclical MCMC is a novel MCMC framework recently proposed by Zhang et al. (2019) to address the challenge posed by high-dimensional multimodal posterior distributions like those arising in deep learning. The algorithm works by generating a nonhomogeneous Markov chain that tracks, cyclically in time, tempered versions of the target distribution. We show in this work that cyclical MCMC converges to the desired probability distribution in settings where the Markov kernels used are fast mixing and sufficiently long cycles are employed. However, in the far more common setting of slow-mixing kernels, the algorithm may fail to produce samples from the desired distribution. In particular, in a simple mixture example with unequal variances we show by simulation that cyclical MCMC fails to converge to the desired limit. Finally, we show that cyclical MCMC typically estimates well the local shape of the target distribution around each mode, even when we do not have convergence to the target.
APA
Wang, L., Liu, X., Smith, A. & Y Atchade, A. (2024). On cyclical MCMC sampling. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3817-3825. Available from https://proceedings.mlr.press/v238/wang24j.html.
