Optimal Training of Fair Predictive Models

Razieh Nabi, Daniel Malinsky, Ilya Shpitser
Proceedings of the First Conference on Causal Learning and Reasoning, PMLR 177:594-617, 2022.

Abstract

Recently there has been sustained interest in modifying prediction algorithms to satisfy fairness constraints. These constraints are typically complex nonlinear functionals of the observed data distribution. Focusing on path-specific causal constraints, we introduce new theoretical results and optimization techniques to make model training easier and more accurate. Specifically, we show how to reparameterize the observed data likelihood such that fairness constraints correspond directly to parameters that appear in the likelihood, transforming a complex constrained optimization objective into a simple optimization problem with box constraints. We also exploit methods from empirical likelihood theory in statistics to improve predictive performance by constraining baseline covariates, without requiring parametric models. We combine the merits of both proposals to optimize a hybrid reparameterized likelihood. The techniques presented here should be applicable more broadly to fair prediction proposals that impose constraints on predictive models.
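
To make the box-constraint idea concrete, here is a minimal toy sketch in Python with NumPy/SciPy (tools the paper does not prescribe). The "fairness" functional g, the linear-Gaussian likelihood, and the reparameterization below are invented for illustration; they stand in for the paper's path-specific causal constraints and its likelihood reparameterization, and only show how moving the constrained functional into the parameter vector turns a nonlinear constraint into a simple box constraint.

# Toy sketch only: a stand-in nonlinear "fairness" functional g(theta) is
# constrained either (a) directly, with a general nonlinear-constraint solver,
# or (b) after reparameterizing so that g is itself a free parameter, in which
# case the constraint is just a box bound on one coordinate.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                                    # covariates
y = X @ np.array([1.0, -0.5]) + rng.normal(scale=0.5, size=200)  # outcome

def nll(theta):
    # Gaussian negative log-likelihood (up to a constant) for a linear model
    resid = y - X @ theta
    return 0.5 * np.sum(resid ** 2)

def g(theta):
    # made-up stand-in for a nonlinear constrained functional of the parameters
    return theta[0] + theta[1] ** 2

eps = 0.05  # tolerance band: require -eps <= g(theta) <= eps

# (a) Original parameterization: g is a nonlinear function of theta, so a
#     solver that handles general nonlinear constraints is needed.
fit_nonlinear = minimize(
    nll, x0=np.zeros(2), method="trust-constr",
    constraints=[NonlinearConstraint(g, -eps, eps)],
)

# (b) Reparameterized likelihood: phi = (g(theta), theta[1]) is a smooth
#     bijection, so the constrained quantity is itself a parameter and the
#     constraint becomes a box bound handled by L-BFGS-B.
def nll_reparam(phi):
    theta = np.array([phi[0] - phi[1] ** 2, phi[1]])
    return nll(theta)

fit_box = minimize(
    nll_reparam, x0=np.zeros(2), method="L-BFGS-B",
    bounds=[(-eps, eps), (None, None)],
)

print("nonlinear-constrained fit:", fit_nonlinear.x)
print("box-constrained fit (phi[0] = g):", fit_box.x)

Both fits recover the same constrained optimum; the point is only that, once the constrained functional is a coordinate of the parameter vector, off-the-shelf box-constrained optimizers apply directly.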

Cite this Paper


BibTeX
@InProceedings{pmlr-v177-nabi22a,
  title     = {Optimal Training of Fair Predictive Models},
  author    = {Nabi, Razieh and Malinsky, Daniel and Shpitser, Ilya},
  booktitle = {Proceedings of the First Conference on Causal Learning and Reasoning},
  pages     = {594--617},
  year      = {2022},
  editor    = {Schölkopf, Bernhard and Uhler, Caroline and Zhang, Kun},
  volume    = {177},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--13 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v177/nabi22a/nabi22a.pdf},
  url       = {https://proceedings.mlr.press/v177/nabi22a.html}
}
APA
Nabi, R., Malinsky, D. & Shpitser, I. (2022). Optimal Training of Fair Predictive Models. Proceedings of the First Conference on Causal Learning and Reasoning, in Proceedings of Machine Learning Research 177:594-617. Available from https://proceedings.mlr.press/v177/nabi22a.html.
