On the Learnability of Distribution Classes with Adaptive Adversaries

Tosca Lechner, Alex Bie, Gautam Kamath
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:32853-32877, 2025.

Abstract

We consider the question of learnability of distribution classes in the presence of adaptive adversaries – that is, adversaries capable of intercepting the samples requested by a learner and applying manipulations with full knowledge of the samples before passing them on to the learner. This stands in contrast to oblivious adversaries, who can only modify the underlying distribution the samples come from but not their i.i.d. nature. We formulate a general notion of learnability with respect to adaptive adversaries, taking into account the budget of the adversary. We show that learnability with respect to additive adaptive adversaries is a strictly stronger condition than learnability with respect to additive oblivious adversaries.
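The distinction between the two adversary models in the abstract can be illustrated with a small simulation. The sketch below is illustrative only and not from the paper: the corrupted mean `10.0`, the Gaussian base distribution, and the function names are all assumptions chosen for the example. The oblivious adversary corrupts the distribution before sampling (samples stay i.i.d. from a mixture), while the adaptive adversary inspects the realized sample and replaces a budgeted fraction of it.

```python
import random

def sample_clean(n, mu=0.0):
    # Learner requests n i.i.d. samples from N(mu, 1).
    return [random.gauss(mu, 1.0) for _ in range(n)]

def oblivious_attack(n, mu=0.0, budget=0.1):
    # Oblivious adversary: corrupts the *distribution* before sampling,
    # mixing in an outlier component with probability `budget`.
    # The samples remain i.i.d. draws from the corrupted mixture.
    return [random.gauss(10.0, 1.0) if random.random() < budget
            else random.gauss(mu, 1.0) for _ in range(n)]

def adaptive_attack(samples, budget=0.1):
    # Adaptive adversary: sees the full realized sample before the
    # learner does, and replaces up to a `budget` fraction of it with
    # full knowledge of the draws - here, the k smallest points.
    # The result is no longer an i.i.d. sample.
    k = int(budget * len(samples))
    out = sorted(samples)
    for i in range(k):
        out[i] = 10.0
    random.shuffle(out)
    return out
```

Note how the adaptive adversary's replacements can depend on the realized draws themselves, which is exactly the power the oblivious adversary lacks.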

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-lechner25a,
  title     = {On the Learnability of Distribution Classes with Adaptive Adversaries},
  author    = {Lechner, Tosca and Bie, Alex and Kamath, Gautam},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {32853--32877},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/lechner25a/lechner25a.pdf},
  url       = {https://proceedings.mlr.press/v267/lechner25a.html},
  abstract  = {We consider the question of learnability of distribution classes in the presence of adaptive adversaries – that is, adversaries capable of intercepting the samples requested by a learner and applying manipulations with full knowledge of the samples before passing it on to the learner. This stands in contrast to oblivious adversaries, who can only modify the underlying distribution the samples come from but not their i.i.d. nature. We formulate a general notion of learnability with respect to adaptive adversaries, taking into account the budget of the adversary. We show that learnability with respect to additive adaptive adversaries is a strictly stronger condition than learnability with respect to additive oblivious adversaries.}
}
Endnote
%0 Conference Paper
%T On the Learnability of Distribution Classes with Adaptive Adversaries
%A Tosca Lechner
%A Alex Bie
%A Gautam Kamath
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-lechner25a
%I PMLR
%P 32853--32877
%U https://proceedings.mlr.press/v267/lechner25a.html
%V 267
%X We consider the question of learnability of distribution classes in the presence of adaptive adversaries – that is, adversaries capable of intercepting the samples requested by a learner and applying manipulations with full knowledge of the samples before passing it on to the learner. This stands in contrast to oblivious adversaries, who can only modify the underlying distribution the samples come from but not their i.i.d. nature. We formulate a general notion of learnability with respect to adaptive adversaries, taking into account the budget of the adversary. We show that learnability with respect to additive adaptive adversaries is a strictly stronger condition than learnability with respect to additive oblivious adversaries.
APA
Lechner, T., Bie, A. & Kamath, G. (2025). On the Learnability of Distribution Classes with Adaptive Adversaries. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:32853-32877. Available from https://proceedings.mlr.press/v267/lechner25a.html.