Towards Robust Influence Functions with Flat Validation Minima

Xichen Ye, Yifan Wu, Weizhong Zhang, Cheng Jin, Yifan Chen
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:72091-72111, 2025.

Abstract

The Influence Function (IF) is a widely used technique for assessing the impact of individual training samples on model predictions. However, existing IF methods often fail to provide reliable influence estimates in deep neural networks, particularly when applied to noisy training data. This issue does not stem from inaccuracies in parameter change estimation, which has been the primary focus of prior research, but rather from deficiencies in loss change estimation, specifically due to the sharpness of validation risk. In this work, we establish a theoretical connection between influence estimation error, validation set risk, and its sharpness, underscoring the importance of flat validation minima for accurate influence estimation. Furthermore, we introduce a novel estimation form of Influence Function specifically designed for flat validation minima. Experimental results across various tasks validate the superiority of our approach.
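
For context (background from Koh & Liang, 2017, not this paper's proposed estimator), the classical first-order influence estimate that IF methods build on approximates how upweighting a training sample $z$ changes the loss on a validation sample $z_{\mathrm{val}}$:

% Background sketch of the classical influence function (Koh & Liang, 2017);
% notation is illustrative, not taken from this paper.
\[
\mathcal{I}(z, z_{\mathrm{val}})
  \;=\; -\,\nabla_\theta \ell(z_{\mathrm{val}}, \hat\theta)^{\top}
        H_{\hat\theta}^{-1}\,
        \nabla_\theta \ell(z, \hat\theta),
\qquad
H_{\hat\theta} \;=\; \frac{1}{n}\sum_{i=1}^{n} \nabla_\theta^{2}\, \ell(z_i, \hat\theta).
\]

Roughly speaking, the factor $H_{\hat\theta}^{-1}\nabla_\theta \ell(z, \hat\theta)$ is the parameter-change estimate that prior work has focused on, while the leading validation gradient $\nabla_\theta \ell(z_{\mathrm{val}}, \hat\theta)$ is the loss-change component whose sensitivity to the sharpness of the validation risk the abstract identifies as the main source of error.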

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-ye25h,
  title     = {Towards Robust Influence Functions with Flat Validation Minima},
  author    = {Ye, Xichen and Wu, Yifan and Zhang, Weizhong and Jin, Cheng and Chen, Yifan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {72091--72111},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/ye25h/ye25h.pdf},
  url       = {https://proceedings.mlr.press/v267/ye25h.html},
  abstract  = {The Influence Function (IF) is a widely used technique for assessing the impact of individual training samples on model predictions. However, existing IF methods often fail to provide reliable influence estimates in deep neural networks, particularly when applied to noisy training data. This issue does not stem from inaccuracies in parameter change estimation, which has been the primary focus of prior research, but rather from deficiencies in loss change estimation, specifically due to the sharpness of validation risk. In this work, we establish a theoretical connection between influence estimation error, validation set risk, and its sharpness, underscoring the importance of flat validation minima for accurate influence estimation. Furthermore, we introduce a novel estimation form of Influence Function specifically designed for flat validation minima. Experimental results across various tasks validate the superiority of our approach.}
}
Endnote
%0 Conference Paper
%T Towards Robust Influence Functions with Flat Validation Minima
%A Xichen Ye
%A Yifan Wu
%A Weizhong Zhang
%A Cheng Jin
%A Yifan Chen
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-ye25h
%I PMLR
%P 72091--72111
%U https://proceedings.mlr.press/v267/ye25h.html
%V 267
%X The Influence Function (IF) is a widely used technique for assessing the impact of individual training samples on model predictions. However, existing IF methods often fail to provide reliable influence estimates in deep neural networks, particularly when applied to noisy training data. This issue does not stem from inaccuracies in parameter change estimation, which has been the primary focus of prior research, but rather from deficiencies in loss change estimation, specifically due to the sharpness of validation risk. In this work, we establish a theoretical connection between influence estimation error, validation set risk, and its sharpness, underscoring the importance of flat validation minima for accurate influence estimation. Furthermore, we introduce a novel estimation form of Influence Function specifically designed for flat validation minima. Experimental results across various tasks validate the superiority of our approach.
APA
Ye, X., Wu, Y., Zhang, W., Jin, C. & Chen, Y. (2025). Towards Robust Influence Functions with Flat Validation Minima. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:72091-72111. Available from https://proceedings.mlr.press/v267/ye25h.html.
