Non-parametric Conditional Independence Testing for Mixed Continuous-Categorical Variables: A Novel Method and Numerical Evaluation

Oana-Iuliana Popescu, Andreas Gerhardus, Martin Rabel, Jakob Runge
Proceedings of the Fourth Conference on Causal Learning and Reasoning, PMLR 275:406-450, 2025.

Abstract

Conditional independence testing (CIT) is a common task in machine learning, e.g., for variable selection, and a main component of constraint-based causal discovery. While most current CIT approaches assume that all variables in a dataset are of the same type, either numerical or categorical, many real-world applications involve mixed-type datasets that include both numerical and categorical variables. Non-parametric CIT can be conducted using conditional mutual information (CMI) estimators combined with a local permutation scheme. Recently, two novel CMI estimators for mixed-type datasets based on k-nearest-neighbors (k-NN) have been proposed. As with any k-NN method, these estimators rely on the definition of a distance metric. One approach computes distances by a one-hot encoding of the categorical variables, essentially treating categorical variables as discrete-numerical, while the other expresses CMI by entropy terms where the categorical variables appear as conditions only. In this work, we study these estimators and propose a variation of the former approach that does not treat categorical variables as numeric. Our extensive numerical experiments show that our variant detects dependencies more robustly across different data distributions and preprocessing types.
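The one-hot distance computation mentioned in the abstract can be illustrated with a minimal sketch (this is not the authors' implementation; `mixed_chebyshev_dist` is a hypothetical helper). Under the Chebyshev (max) norm typically used in k-NN-based CMI estimators, one-hot encoding a categorical variable reduces to a 0/1 indicator distance on that coordinate: two distinct categories differ by 1 in two one-hot coordinates, so their max-norm contribution is 1, while equal categories contribute 0.

```python
def mixed_chebyshev_dist(x, y, cat_mask):
    """Chebyshev (max-norm) distance between two mixed-type points.

    Numeric coordinates contribute |x_i - y_i|; categorical coordinates
    contribute 0 if the categories match and 1 otherwise, which is
    exactly what a one-hot encoding yields under the max norm.
    """
    d = 0.0
    for xi, yi, is_cat in zip(x, y, cat_mask):
        if is_cat:
            d = max(d, 0.0 if xi == yi else 1.0)
        else:
            d = max(d, abs(float(xi) - float(yi)))
    return d

# Example: one numeric and one categorical coordinate.
p = (0.3, "red")
q = (0.5, "blue")
print(mixed_chebyshev_dist(p, q, cat_mask=[False, True]))  # -> 1.0
```

Because a categorical mismatch always contributes the maximal distance 1, such a metric effectively treats categories as discrete-numerical levels; the paper's proposed variant avoids exactly this numeric treatment of categorical variables.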

Cite this Paper


BibTeX
@InProceedings{pmlr-v275-popescu25a,
  title     = {Non-parametric Conditional Independence Testing for Mixed Continuous-Categorical Variables: A Novel Method and Numerical Evaluation},
  author    = {Popescu, Oana-Iuliana and Gerhardus, Andreas and Rabel, Martin and Runge, Jakob},
  booktitle = {Proceedings of the Fourth Conference on Causal Learning and Reasoning},
  pages     = {406--450},
  year      = {2025},
  editor    = {Huang, Biwei and Drton, Mathias},
  volume    = {275},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--09 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v275/main/assets/popescu25a/popescu25a.pdf},
  url       = {https://proceedings.mlr.press/v275/popescu25a.html},
  abstract  = {Conditional independence testing (CIT) is a common task in machine learning, e.g., for variable selection, and a main component of constraint-based causal discovery. While most current CIT approaches assume that all variables in a dataset are of the same type, either numerical or categorical, many real-world applications involve mixed-type datasets that include both numerical and categorical variables. Non-parametric CIT can be conducted using conditional mutual information (CMI) estimators combined with a local permutation scheme. Recently, two novel CMI estimators for mixed-type datasets based on k-nearest-neighbors (k-NN) have been proposed. As with any k-NN method, these estimators rely on the definition of a distance metric. One approach computes distances by a one-hot encoding of the categorical variables, essentially treating categorical variables as discrete-numerical, while the other expresses CMI by entropy terms where the categorical variables appear as conditions only. In this work, we study these estimators and propose a variation of the former approach that does not treat categorical variables as numeric. Our extensive numerical experiments show that our variant detects dependencies more robustly across different data distributions and preprocessing types.}
}
Endnote
%0 Conference Paper
%T Non-parametric Conditional Independence Testing for Mixed Continuous-Categorical Variables: A Novel Method and Numerical Evaluation
%A Oana-Iuliana Popescu
%A Andreas Gerhardus
%A Martin Rabel
%A Jakob Runge
%B Proceedings of the Fourth Conference on Causal Learning and Reasoning
%C Proceedings of Machine Learning Research
%D 2025
%E Biwei Huang
%E Mathias Drton
%F pmlr-v275-popescu25a
%I PMLR
%P 406--450
%U https://proceedings.mlr.press/v275/popescu25a.html
%V 275
%X Conditional independence testing (CIT) is a common task in machine learning, e.g., for variable selection, and a main component of constraint-based causal discovery. While most current CIT approaches assume that all variables in a dataset are of the same type, either numerical or categorical, many real-world applications involve mixed-type datasets that include both numerical and categorical variables. Non-parametric CIT can be conducted using conditional mutual information (CMI) estimators combined with a local permutation scheme. Recently, two novel CMI estimators for mixed-type datasets based on k-nearest-neighbors (k-NN) have been proposed. As with any k-NN method, these estimators rely on the definition of a distance metric. One approach computes distances by a one-hot encoding of the categorical variables, essentially treating categorical variables as discrete-numerical, while the other expresses CMI by entropy terms where the categorical variables appear as conditions only. In this work, we study these estimators and propose a variation of the former approach that does not treat categorical variables as numeric. Our extensive numerical experiments show that our variant detects dependencies more robustly across different data distributions and preprocessing types.
APA
Popescu, O., Gerhardus, A., Rabel, M. & Runge, J. (2025). Non-parametric Conditional Independence Testing for Mixed Continuous-Categorical Variables: A Novel Method and Numerical Evaluation. Proceedings of the Fourth Conference on Causal Learning and Reasoning, in Proceedings of Machine Learning Research 275:406-450. Available from https://proceedings.mlr.press/v275/popescu25a.html.
