Improving Generalization with Flat Hilbert Bayesian Inference

Tuan Truong, Quyen Tran, Ngoc-Quan Pham, Nhat Ho, Dinh Phung, Trung Le
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:60218-60237, 2025.

Abstract

We introduce Flat Hilbert Bayesian Inference (FHBI), an algorithm designed to enhance generalization in Bayesian inference. Our approach involves an iterative two-step procedure with an adversarial functional perturbation step and a functional descent step within the reproducing kernel Hilbert spaces. This methodology is supported by a theoretical analysis that extends previous findings on generalization ability from finite-dimensional Euclidean spaces to infinite-dimensional functional spaces. To evaluate the effectiveness of FHBI, we conduct comprehensive comparisons against nine baseline methods on the VTAB-1K benchmark, which encompasses 19 diverse datasets across various domains with diverse semantics. Empirical results demonstrate that FHBI consistently outperforms the baselines by notable margins, highlighting its practical efficacy.
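The two-step procedure described above (an adversarial functional perturbation followed by a functional descent step in a reproducing kernel Hilbert space) can be sketched, very loosely, by combining an SVGD-style kernelized particle update with a SAM-style perturbation. Everything below is a hypothetical illustration under those assumptions, not the authors' implementation: the function names (`fhbi_step`, `svgd_direction`), the RBF bandwidth `h`, and the step sizes `rho` and `lr` are all made up for this sketch.

```python
import numpy as np

def rbf_kernel(x, y, h=1.0):
    # RBF kernel value and its gradient w.r.t. the first argument x
    diff = x - y
    k = np.exp(-np.sum(diff**2) / (2 * h))
    return k, -diff / h * k

def svgd_direction(particles, grad_log_p):
    # Kernelized (RKHS) ascent direction for each particle, as in SVGD:
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad_log_p(x_j) + grad_{x_j} k(x_j, x_i) ]
    n = len(particles)
    phi = np.zeros_like(particles)
    for i in range(n):
        for j in range(n):
            k, gk = rbf_kernel(particles[j], particles[i])
            phi[i] += k * grad_log_p(particles[j]) + gk
    return phi / n

def fhbi_step(particles, grad_log_p, rho=0.05, lr=0.1):
    # Hypothetical two-step sketch:
    # 1) adversarial functional perturbation: move the particles a distance
    #    rho *against* the RKHS ascent direction (the "worst" nearby function)
    phi = svgd_direction(particles, grad_log_p)
    perturbed = particles - rho * phi / (np.linalg.norm(phi) + 1e-12)
    # 2) functional descent: evaluate the direction at the perturbed
    #    particles, then update the original particles with it
    phi_adv = svgd_direction(perturbed, grad_log_p)
    return particles + lr * phi_adv

# Usage: draw particles toward a 2-D standard normal target
rng = np.random.default_rng(0)
particles = rng.normal(loc=3.0, size=(10, 2))
for _ in range(150):
    particles = fhbi_step(particles, lambda x: -x)  # grad log N(0, I) = -x
```

The design mirrors sharpness-aware minimization: the perturbation probes a worst-case direction in function space before committing to the descent step, which is the intuition the abstract's flatness analysis formalizes.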

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-truong25b,
  title     = {Improving Generalization with Flat {H}ilbert {B}ayesian Inference},
  author    = {Truong, Tuan and Tran, Quyen and Pham, Ngoc-Quan and Ho, Nhat and Phung, Dinh and Le, Trung},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {60218--60237},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/truong25b/truong25b.pdf},
  url       = {https://proceedings.mlr.press/v267/truong25b.html},
  abstract  = {We introduce Flat Hilbert Bayesian Inference (FHBI), an algorithm designed to enhance generalization in Bayesian inference. Our approach involves an iterative two-step procedure with an adversarial functional perturbation step and a functional descent step within the reproducing kernel Hilbert spaces. This methodology is supported by a theoretical analysis that extends previous findings on generalization ability from finite-dimensional Euclidean spaces to infinite-dimensional functional spaces. To evaluate the effectiveness of FHBI, we conduct comprehensive comparisons against nine baseline methods on the VTAB-1K benchmark, which encompasses 19 diverse datasets across various domains with diverse semantics. Empirical results demonstrate that FHBI consistently outperforms the baselines by notable margins, highlighting its practical efficacy.}
}
Endnote
%0 Conference Paper
%T Improving Generalization with Flat Hilbert Bayesian Inference
%A Tuan Truong
%A Quyen Tran
%A Ngoc-Quan Pham
%A Nhat Ho
%A Dinh Phung
%A Trung Le
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-truong25b
%I PMLR
%P 60218--60237
%U https://proceedings.mlr.press/v267/truong25b.html
%V 267
%X We introduce Flat Hilbert Bayesian Inference (FHBI), an algorithm designed to enhance generalization in Bayesian inference. Our approach involves an iterative two-step procedure with an adversarial functional perturbation step and a functional descent step within the reproducing kernel Hilbert spaces. This methodology is supported by a theoretical analysis that extends previous findings on generalization ability from finite-dimensional Euclidean spaces to infinite-dimensional functional spaces. To evaluate the effectiveness of FHBI, we conduct comprehensive comparisons against nine baseline methods on the VTAB-1K benchmark, which encompasses 19 diverse datasets across various domains with diverse semantics. Empirical results demonstrate that FHBI consistently outperforms the baselines by notable margins, highlighting its practical efficacy.
APA
Truong, T., Tran, Q., Pham, N., Ho, N., Phung, D. & Le, T. (2025). Improving Generalization with Flat Hilbert Bayesian Inference. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:60218-60237. Available from https://proceedings.mlr.press/v267/truong25b.html.