Sampling from the Random Linear Model via Stochastic Localization Up to the AMP Threshold

Han Cui, Zhiyuan Yu, Jingbo Liu
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:4042-4050, 2025.

Abstract

Recently, Approximate Message Passing (AMP) has been integrated with stochastic localization (a diffusion model) by providing a computationally efficient estimator of the posterior mean. Existing (rigorous) analyses typically prove the success of sampling for sufficiently small noise, but determining the exact threshold involves several challenges. In this paper, we focus on sampling from the posterior in the linear inverse problem with an i.i.d. random design matrix, and show that the threshold for sampling coincides with that of posterior mean estimation. We prove convergence in smoothed KL divergence whenever the noise variance $\Delta$ is below $\Delta_{\rm AMP}$, the computational threshold for mean estimation introduced in (Barbier et al., 2020). We also show convergence in the Wasserstein distance under the same threshold, assuming a dimension-free bound on the operator norm of the posterior covariance matrix, a condition strongly suggested by recent breakthroughs on operator norm bounds in similar replica symmetric systems. A key observation in our analysis is that no phase transition occurs along the sampling and interpolation paths when $\Delta<\Delta_{\rm AMP}$.
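For intuition, the following is a minimal, hypothetical sketch of the scheme the abstract describes: an AMP routine supplies the posterior-mean drift of a stochastic localization (diffusion) process for the linear model $y = Ax + w$, $w \sim \mathcal{N}(0, \Delta I)$. It assumes a Rademacher ($\pm 1$) prior, a design matrix with i.i.d. $\mathcal{N}(0, 1/m)$ entries, and a crude Euler discretization; all function names, conventions, and tuning constants are illustrative, not the paper's exact algorithm or its threshold analysis.

import numpy as np

rng = np.random.default_rng(0)

def amp_posterior_mean(A, y, u, t, Delta, iters=30):
    """AMP estimate of E[x | y, u_t = u] for a Rademacher signal x (illustrative).

    Channels: y = A x + N(0, Delta I) with A (m x n), A_ij ~ N(0, 1/m) i.i.d.,
    plus the localization observation u = t x + W_t, Var(W_t) = t I.
    For x_i in {-1, +1}, the coordinatewise posterior mean given the effective
    AMP channel r ~ x + N(0, tau2 I) and the side channel u is tanh(r/tau2 + u).
    """
    m, n = A.shape
    x, z = np.zeros(n), y.copy()
    tau2 = Delta + n / m                      # state evolution started at x = 0
    for _ in range(iters):
        r = x + A.T @ z                       # effective observation of x
        x = np.tanh(r / tau2 + u)             # coordinatewise posterior mean
        onsager = (n / m) * np.mean((1.0 - x**2) / tau2)
        z = y - A @ x + onsager * z           # Onsager-corrected residual
        tau2 = Delta + (n / m) * np.mean(1.0 - x**2)  # state-evolution update
    return x

def localization_sampler(A, y, Delta, T=20.0, dt=0.05):
    """Euler scheme for du_t = E[x | y, u_t] dt + dB_t, started at u_0 = 0."""
    n = A.shape[1]
    u, t = np.zeros(n), 0.0
    while t < T:
        drift = amp_posterior_mean(A, y, u, t, Delta)
        u += drift * dt + np.sqrt(dt) * rng.standard_normal(n)
        t += dt
    return np.sign(u / T)   # u_t / t localizes on a posterior sample as t grows

At t = 0 the drift is simply the AMP posterior mean given y alone; as t grows, u_t / t concentrates on a single sample from the posterior, which is why the final sign is returned. The paper's guarantees concern the continuous-time process below $\Delta_{\rm AMP}$; this discretization is only a heuristic illustration.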

Cite this Paper

BibTeX
@InProceedings{pmlr-v258-cui25c,
  title     = {Sampling from the Random Linear Model via Stochastic Localization Up to the AMP Threshold},
  author    = {Cui, Han and Yu, Zhiyuan and Liu, Jingbo},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {4042--4050},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/cui25c/cui25c.pdf},
  url       = {https://proceedings.mlr.press/v258/cui25c.html},
  abstract  = {Recently, Approximate Message Passing (AMP) has been integrated with stochastic localization (a diffusion model) by providing a computationally efficient estimator of the posterior mean. Existing (rigorous) analyses typically prove the success of sampling for sufficiently small noise, but determining the exact threshold involves several challenges. In this paper, we focus on sampling from the posterior in the linear inverse problem with an i.i.d. random design matrix, and show that the threshold for sampling coincides with that of posterior mean estimation. We prove convergence in smoothed KL divergence whenever the noise variance $\Delta$ is below $\Delta_{\rm AMP}$, the computational threshold for mean estimation introduced in (Barbier et al., 2020). We also show convergence in the Wasserstein distance under the same threshold, assuming a dimension-free bound on the operator norm of the posterior covariance matrix, a condition strongly suggested by recent breakthroughs on operator norm bounds in similar replica symmetric systems. A key observation in our analysis is that no phase transition occurs along the sampling and interpolation paths when $\Delta<\Delta_{\rm AMP}$.}
}
Endnote
%0 Conference Paper
%T Sampling from the Random Linear Model via Stochastic Localization Up to the AMP Threshold
%A Han Cui
%A Zhiyuan Yu
%A Jingbo Liu
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-cui25c
%I PMLR
%P 4042--4050
%U https://proceedings.mlr.press/v258/cui25c.html
%V 258
%X Recently, Approximate Message Passing (AMP) has been integrated with stochastic localization (a diffusion model) by providing a computationally efficient estimator of the posterior mean. Existing (rigorous) analyses typically prove the success of sampling for sufficiently small noise, but determining the exact threshold involves several challenges. In this paper, we focus on sampling from the posterior in the linear inverse problem with an i.i.d. random design matrix, and show that the threshold for sampling coincides with that of posterior mean estimation. We prove convergence in smoothed KL divergence whenever the noise variance $\Delta$ is below $\Delta_{\rm AMP}$, the computational threshold for mean estimation introduced in (Barbier et al., 2020). We also show convergence in the Wasserstein distance under the same threshold, assuming a dimension-free bound on the operator norm of the posterior covariance matrix, a condition strongly suggested by recent breakthroughs on operator norm bounds in similar replica symmetric systems. A key observation in our analysis is that no phase transition occurs along the sampling and interpolation paths when $\Delta<\Delta_{\rm AMP}$.
APA
Cui, H., Yu, Z. & Liu, J. (2025). Sampling from the Random Linear Model via Stochastic Localization Up to the AMP Threshold. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:4042-4050. Available from https://proceedings.mlr.press/v258/cui25c.html.
