Statistical Learning under Heterogeneous Distribution Shift

Max Simchowitz, Anurag Ajay, Pulkit Agrawal, Akshay Krishnamurthy
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:31800-31851, 2023.

Abstract

This paper studies the prediction of a target $\mathbf{z}$ from a pair of random variables $(\mathbf{x},\mathbf{y})$, where the ground-truth predictor is additive $\mathbb{E}[\mathbf{z} \mid \mathbf{x},\mathbf{y}] = f_\star(\mathbf{x}) + g_\star(\mathbf{y})$. We study the performance of empirical risk minimization (ERM) over functions $f+g$, $f \in \mathcal{F}$ and $g \in \mathcal{G}$, fit on a given training distribution, but evaluated on a test distribution which exhibits covariate shift. We show that, when the class $\mathcal{F}$ is "simpler" than $\mathcal{G}$ (measured, e.g., in terms of its metric entropy), our predictor is more resilient to heterogeneous covariate shifts in which the shift in $\mathbf{x}$ is much greater than that in $\mathbf{y}$. These results rely on a novel Hölder-style inequality for the Dudley integral which may be of independent interest. Moreover, we corroborate our theoretical findings with experiments demonstrating improved resilience to shifts in "simpler" features across numerous domains.
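For concreteness, the ERM procedure the abstract describes can be sketched as follows (a minimal illustration; the choice of square loss and the notation $P_{\mathrm{train}}$, $P_{\mathrm{test}}$ are assumptions made here for exposition, not taken from the paper):

$$(\hat{f},\hat{g}) \in \operatorname*{argmin}_{f \in \mathcal{F},\, g \in \mathcal{G}} \; \frac{1}{n}\sum_{i=1}^{n} \bigl(f(\mathbf{x}_i) + g(\mathbf{y}_i) - \mathbf{z}_i\bigr)^2, \qquad (\mathbf{x}_i,\mathbf{y}_i,\mathbf{z}_i) \overset{\text{i.i.d.}}{\sim} P_{\mathrm{train}},$$

with performance measured by the risk of $\hat{f}+\hat{g}$ under a test distribution $P_{\mathrm{test}}$ whose marginal over $(\mathbf{x},\mathbf{y})$ differs from that of $P_{\mathrm{train}}$ (covariate shift), while the regression function $\mathbb{E}[\mathbf{z} \mid \mathbf{x},\mathbf{y}] = f_\star(\mathbf{x}) + g_\star(\mathbf{y})$ is shared by both distributions. A "heterogeneous" shift is one where the marginal of $\mathbf{x}$ moves much more than that of $\mathbf{y}$.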

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-simchowitz23a,
  title     = {Statistical Learning under Heterogeneous Distribution Shift},
  author    = {Simchowitz, Max and Ajay, Anurag and Agrawal, Pulkit and Krishnamurthy, Akshay},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {31800--31851},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/simchowitz23a/simchowitz23a.pdf},
  url       = {https://proceedings.mlr.press/v202/simchowitz23a.html}
}
Endnote
%0 Conference Paper
%T Statistical Learning under Heterogeneous Distribution Shift
%A Max Simchowitz
%A Anurag Ajay
%A Pulkit Agrawal
%A Akshay Krishnamurthy
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-simchowitz23a
%I PMLR
%P 31800--31851
%U https://proceedings.mlr.press/v202/simchowitz23a.html
%V 202
APA
Simchowitz, M., Ajay, A., Agrawal, P. & Krishnamurthy, A. (2023). Statistical Learning under Heterogeneous Distribution Shift. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:31800-31851. Available from https://proceedings.mlr.press/v202/simchowitz23a.html.