Efficiently Access Diffusion Fisher: Within the Outer Product Span Space

Fangyikang Wang, Hubery Yin, Shaobin Zhuang, Huminhao Zhu, Yinan Li, Lei Qian, Chao Zhang, Hanbin Zhao, Hui Qian, Chen Li
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:65253-65277, 2025.

Abstract

Recent advancements in diffusion models (DMs) have explored incorporating the second-order diffusion Fisher information (DF), defined as the negative Hessian of the log density, into various downstream tasks and theoretical analyses. However, current practice typically approximates the diffusion Fisher by applying auto-differentiation to the learned score network. This black-box method, though straightforward, lacks any accuracy guarantee and is time-consuming. In this paper, we show that the diffusion Fisher actually resides within a space spanned by the outer products of the score and the initial data. Based on this outer-product structure, we develop two efficient approximation algorithms to access the trace and the matrix-vector multiplication of DF, respectively. These algorithms bypass auto-differentiation in favor of time-efficient vector-product calculations. Furthermore, we establish approximation error bounds for the proposed algorithms. Experiments in likelihood evaluation and adjoint optimization demonstrate the superior accuracy and reduced computational cost of our algorithms. Additionally, based on the novel outer-product formulation of DF, we design the first numerical verification experiment for the optimal transport property of the map induced by the general PF-ODE.
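To make the contrast in the abstract concrete, the sketch below illustrates the auto-differentiation baseline it refers to: given a learned score network s_θ(x, t) ≈ ∇_x log p_t(x), the diffusion Fisher is DF = −∇_x s_θ(x, t), so its matrix-vector products (and, via Hutchinson's estimator, its trace) can be obtained through backpropagation. This is a minimal PyTorch sketch of that baseline, not the paper's outer-product algorithms; the `score_fn(x, t)` signature is an assumption for illustration.

```python
import torch

def fisher_vector_product(score_fn, x, t, v):
    """Baseline DF @ v via autodiff: DF = -J_x s_theta(x, t), so DF @ v = -grad_x (s_theta(x, t) . v).
    Relies on the score Jacobian being symmetric (it is the Hessian of log p_t)."""
    x = x.detach().requires_grad_(True)
    s = score_fn(x, t)  # s_theta(x, t) ~ grad_x log p_t(x); hypothetical signature
    (jv,) = torch.autograd.grad(s, x, grad_outputs=v)  # v^T J = J v by symmetry
    return -jv

def fisher_trace_hutchinson(score_fn, x, t, n_probes=10):
    """Baseline tr(DF) via Hutchinson's estimator with Rademacher probes:
    tr(DF) = -E_v[ v^T (J_x s_theta) v ]."""
    est = torch.zeros((), device=x.device, dtype=x.dtype)
    for _ in range(n_probes):
        v = torch.empty_like(x).bernoulli_(0.5).mul_(2).sub_(1)  # entries in {-1, +1}
        est = est + (v * fisher_vector_product(score_fn, x, t, v)).sum()
    return est / n_probes
```

Each call above requires a backward pass through the score network; the paper's proposed algorithms instead access tr(DF) and DF-vector products through vector-product calculations in the span of score and initial-data outer products, avoiding auto-differentiation entirely.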

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-wang25ec,
  title     = {Efficiently Access Diffusion Fisher: Within the Outer Product Span Space},
  author    = {Wang, Fangyikang and Yin, Hubery and Zhuang, Shaobin and Zhu, Huminhao and Li, Yinan and Qian, Lei and Zhang, Chao and Zhao, Hanbin and Qian, Hui and Li, Chen},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {65253--65277},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/wang25ec/wang25ec.pdf},
  url       = {https://proceedings.mlr.press/v267/wang25ec.html},
  abstract  = {Recent Diffusion models (DMs) advancements have explored incorporating the second-order diffusion Fisher information (DF), defined as the negative Hessian of log density, into various downstream tasks and theoretical analysis. However, current practices typically approximate the diffusion Fisher by applying auto-differentiation to the learned score network. This black-box method, though straightforward, lacks any accuracy guarantee and is time-consuming. In this paper, we show that the diffusion Fisher actually resides within a space spanned by the outer products of score and initial data. Based on the outer-product structure, we develop two efficient approximation algorithms to access the trace and matrix-vector multiplication of DF, respectively. These algorithms bypass the auto-differentiation operations with time-efficient vector-product calculations. Furthermore, we establish the approximation error bounds for the proposed algorithms. Experiments in likelihood evaluation and adjoint optimization demonstrate the superior accuracy and reduced computational cost of our proposed algorithms. Additionally, based on the novel outer-product formulation of DF, we design the first numerical verification experiment for the optimal transport property of the general PF-ODE deduced map.}
}
Endnote
%0 Conference Paper
%T Efficiently Access Diffusion Fisher: Within the Outer Product Span Space
%A Fangyikang Wang
%A Hubery Yin
%A Shaobin Zhuang
%A Huminhao Zhu
%A Yinan Li
%A Lei Qian
%A Chao Zhang
%A Hanbin Zhao
%A Hui Qian
%A Chen Li
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-wang25ec
%I PMLR
%P 65253--65277
%U https://proceedings.mlr.press/v267/wang25ec.html
%V 267
%X Recent Diffusion models (DMs) advancements have explored incorporating the second-order diffusion Fisher information (DF), defined as the negative Hessian of log density, into various downstream tasks and theoretical analysis. However, current practices typically approximate the diffusion Fisher by applying auto-differentiation to the learned score network. This black-box method, though straightforward, lacks any accuracy guarantee and is time-consuming. In this paper, we show that the diffusion Fisher actually resides within a space spanned by the outer products of score and initial data. Based on the outer-product structure, we develop two efficient approximation algorithms to access the trace and matrix-vector multiplication of DF, respectively. These algorithms bypass the auto-differentiation operations with time-efficient vector-product calculations. Furthermore, we establish the approximation error bounds for the proposed algorithms. Experiments in likelihood evaluation and adjoint optimization demonstrate the superior accuracy and reduced computational cost of our proposed algorithms. Additionally, based on the novel outer-product formulation of DF, we design the first numerical verification experiment for the optimal transport property of the general PF-ODE deduced map.
APA
Wang, F., Yin, H., Zhuang, S., Zhu, H., Li, Y., Qian, L., Zhang, C., Zhao, H., Qian, H. & Li, C. (2025). Efficiently Access Diffusion Fisher: Within the Outer Product Span Space. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:65253-65277. Available from https://proceedings.mlr.press/v267/wang25ec.html.