Influence Diagnostics under Self-concordance

Jillian Fisher, Lang Liu, Krishna Pillutla, Yejin Choi, Zaid Harchaoui
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:10028-10076, 2023.

Abstract

Influence diagnostics such as influence functions and approximate maximum influence perturbations are popular in machine learning and in AI domain applications. Influence diagnostics are powerful statistical tools to identify influential datapoints or subsets of datapoints. We establish finite-sample statistical bounds, as well as computational complexity bounds, for influence functions and approximate maximum influence perturbations using efficient inverse-Hessian-vector product implementations. We illustrate our results with generalized linear models and large attention-based models on synthetic and real data.
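For readers unfamiliar with the diagnostics discussed in the abstract, the influence function of a datapoint z at the fitted parameters is I(z) = -H^{-1} ∇L(z), where H is the empirical Hessian of the training loss. The sketch below is a minimal illustration of this quantity, not the paper's implementation: it solves the linear system by conjugate gradient, so only Hessian-vector products are needed (the idea behind efficient inverse-Hessian-vector product implementations). All names and the damping parameter are hypothetical.

```python
import numpy as np

def influence_function(grad_loss_z, hessian, damping=1e-3, tol=1e-10):
    """Influence of datapoint z on the parameters: I(z) = -H^{-1} grad L(z).

    Solved via conjugate gradient on (H + damping*I) x = grad L(z),
    which requires only Hessian-vector products, not an explicit inverse.
    The damping term is an illustrative stabilizer, not from the paper.
    """
    d = grad_loss_z.shape[0]
    H = hessian + damping * np.eye(d)
    # Conjugate gradient: minimize (1/2) x^T H x - grad_loss_z^T x.
    x = np.zeros(d)
    r = grad_loss_z - H @ x      # residual
    p = r.copy()                 # search direction
    rs = r @ r
    for _ in range(10 * d):
        Hp = H @ p
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return -x
```

For a well-conditioned Hessian, conjugate gradient converges in at most d iterations, which is the source of the computational savings over forming H^{-1} directly.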

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-fisher23a,
  title     = {Influence Diagnostics under Self-concordance},
  author    = {Fisher, Jillian and Liu, Lang and Pillutla, Krishna and Choi, Yejin and Harchaoui, Zaid},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {10028--10076},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/fisher23a/fisher23a.pdf},
  url       = {https://proceedings.mlr.press/v206/fisher23a.html},
  abstract  = {Influence diagnostics such as influence functions and approximate maximum influence perturbations are popular in machine learning and in AI domain applications. Influence diagnostics are powerful statistical tools to identify influential datapoints or subsets of datapoints. We establish finite-sample statistical bounds, as well as computational complexity bounds, for influence functions and approximate maximum influence perturbations using efficient inverse-Hessian-vector product implementations. We illustrate our results with generalized linear models and large attention based models on synthetic and real data.}
}
Endnote
%0 Conference Paper
%T Influence Diagnostics under Self-concordance
%A Jillian Fisher
%A Lang Liu
%A Krishna Pillutla
%A Yejin Choi
%A Zaid Harchaoui
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-fisher23a
%I PMLR
%P 10028--10076
%U https://proceedings.mlr.press/v206/fisher23a.html
%V 206
%X Influence diagnostics such as influence functions and approximate maximum influence perturbations are popular in machine learning and in AI domain applications. Influence diagnostics are powerful statistical tools to identify influential datapoints or subsets of datapoints. We establish finite-sample statistical bounds, as well as computational complexity bounds, for influence functions and approximate maximum influence perturbations using efficient inverse-Hessian-vector product implementations. We illustrate our results with generalized linear models and large attention based models on synthetic and real data.
APA
Fisher, J., Liu, L., Pillutla, K., Choi, Y. & Harchaoui, Z. (2023). Influence Diagnostics under Self-concordance. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:10028-10076. Available from https://proceedings.mlr.press/v206/fisher23a.html.
