Efficient fair PCA for fair representation learning

Matthäus Kleindessner, Michele Donini, Chris Russell, Muhammad Bilal Zafar
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:5250-5270, 2023.

Abstract

We revisit the problem of fair principal component analysis (PCA), where the goal is to learn the best low-rank linear approximation of the data that obfuscates demographic information. We propose a conceptually simple approach that allows for an analytic solution similar to standard PCA and can be kernelized. Our methods have the same complexity as standard PCA, or kernel PCA, and run much faster than existing methods for fair PCA based on semidefinite programming or manifold optimization, while achieving similar results.
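To make the abstract's idea concrete, below is a minimal numpy/scipy sketch of one way such a constrained PCA admits an analytic solution. It enforces only the simplest fairness constraint, namely that the two groups' projected means coincide, by running PCA restricted to the orthogonal complement of the group-mean difference. The function name fair_pca and this particular constraint choice are our own illustration of the general recipe, not necessarily the exact formulation in the paper.

import numpy as np
from scipy.linalg import null_space

def fair_pca(X, groups, k):
    # X: (n, d) data matrix, groups: (n,) binary group labels, k: target dimension.
    # Returns a (d, k) matrix U with orthonormal columns orthogonal to the
    # group-mean difference, so both groups have the same projected mean.
    Xc = X - X.mean(axis=0)                     # center the data
    z = X[groups == 0].mean(axis=0) - X[groups == 1].mean(axis=0)
    N = null_space(z[None, :])                  # orthonormal basis of {u : z^T u = 0}, shape (d, d-1)
    S = Xc.T @ Xc / X.shape[0]                  # covariance matrix, shape (d, d)
    evals, evecs = np.linalg.eigh(N.T @ S @ N)  # standard PCA restricted to the constraint set
    W = evecs[:, ::-1][:, :k]                   # top-k eigenvectors (eigh sorts eigenvalues ascending)
    return N @ W                                # lift the solution back to the ambient space

# Usage: Z = (X - X.mean(axis=0)) @ fair_pca(X, groups, k=2) yields a
# 2-dimensional representation whose group-wise means are identical.

Since the only work beyond standard PCA is one orthonormal-basis computation, this is consistent with the abstract's claim that the complexity matches that of PCA; the kernelized variant mentioned in the abstract would play out analogously with a kernel matrix in place of the covariance, which we do not sketch here.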

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-kleindessner23a,
  title     = {Efficient fair PCA for fair representation learning},
  author    = {Kleindessner, Matth\"aus and Donini, Michele and Russell, Chris and Zafar, Muhammad Bilal},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {5250--5270},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/kleindessner23a/kleindessner23a.pdf},
  url       = {https://proceedings.mlr.press/v206/kleindessner23a.html},
  abstract  = {We revisit the problem of fair principal component analysis (PCA), where the goal is to learn the best low-rank linear approximation of the data that obfuscates demographic information. We propose a conceptually simple approach that allows for an analytic solution similar to standard PCA and can be kernelized. Our methods have the same complexity as standard PCA, or kernel PCA, and run much faster than existing methods for fair PCA based on semidefinite programming or manifold optimization, while achieving similar results.}
}
Endnote
%0 Conference Paper
%T Efficient fair PCA for fair representation learning
%A Matthäus Kleindessner
%A Michele Donini
%A Chris Russell
%A Muhammad Bilal Zafar
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-kleindessner23a
%I PMLR
%P 5250--5270
%U https://proceedings.mlr.press/v206/kleindessner23a.html
%V 206
%X We revisit the problem of fair principal component analysis (PCA), where the goal is to learn the best low-rank linear approximation of the data that obfuscates demographic information. We propose a conceptually simple approach that allows for an analytic solution similar to standard PCA and can be kernelized. Our methods have the same complexity as standard PCA, or kernel PCA, and run much faster than existing methods for fair PCA based on semidefinite programming or manifold optimization, while achieving similar results.
APA
Kleindessner, M., Donini, M., Russell, C. & Zafar, M.B. (2023). Efficient fair PCA for fair representation learning. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:5250-5270. Available from https://proceedings.mlr.press/v206/kleindessner23a.html.
