Scaling Causal Inference in Additive Noise Models

Charles Karim Assaad, Emilie Devijver, Eric Gaussier, Ali Ait-Bachir
Proceedings of Machine Learning Research, PMLR 104:22-33, 2019.

Abstract

The discovery of causal relationships from observations is a fundamental and difficult problem. We address it in the context of Additive Noise Models, and show, through both consistency analysis and experiments, that the state-of-the-art causal inference procedure on such models can be made simpler and faster, without loss of performance. Indeed, the method we propose uses one regressor instead of two in the bivariate case and 2(d − 1) regressors instead of (d^2 − 1) in the multivariate case with d random variables. In addition, we show how one can, from the regressors we use, accelerate the computation of the Hilbert-Schmidt Independence Criterion, a standard independence measure used in several causal inference procedures.
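For context, the sketch below illustrates the standard bivariate Additive Noise Model procedure that the paper simplifies: fit a nonparametric regressor in each direction and score each direction by the HSIC between the putative cause and the regression residual, preferring the direction whose residual is most independent of its input. This is a minimal illustration, not the authors' implementation; the function names (rbf_gram, hsic, krr_fit_predict, anm_direction), the kernel ridge regressor, and the median-heuristic bandwidth are illustrative choices assumed here.

# Minimal sketch (NumPy only) of the standard two-regressor bivariate ANM test
# with a biased HSIC statistic; illustrative, not the paper's one-regressor method.
import numpy as np

def rbf_gram(z, bandwidth=None):
    # Gaussian (RBF) Gram matrix of a 1-D sample; median-heuristic bandwidth.
    z = np.asarray(z, dtype=float).reshape(-1, 1)
    sq = (z - z.T) ** 2
    if bandwidth is None:
        med = np.median(sq[sq > 0]) if np.any(sq > 0) else 1.0
        bandwidth = np.sqrt(0.5 * med)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

def hsic(a, b):
    # Biased HSIC estimator (Gretton et al.): (1/n^2) tr(K H L H).
    n = len(a)
    K, L = rbf_gram(a), rbf_gram(b)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

def krr_fit_predict(x, y, lam=1e-3):
    # Kernel ridge regression of y on x, returning in-sample predictions.
    K = rbf_gram(x)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return K @ alpha

def anm_direction(x, y):
    # Return 'x->y' or 'y->x': the correct direction should leave a residual
    # that is (nearly) independent of the putative cause, i.e. a lower HSIC.
    r_xy = y - krr_fit_predict(x, y)   # residual of the model x -> y
    r_yx = x - krr_fit_predict(y, x)   # residual of the model y -> x
    return 'x->y' if hsic(x, r_xy) < hsic(y, r_yx) else 'y->x'

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-2, 2, 300)
    y = x ** 3 + 0.3 * rng.standard_normal(300)   # additive-noise ground truth: x -> y
    print(anm_direction(x, y))                    # expected: 'x->y'

Note that this baseline requires two regressions in the bivariate case (and, scaled up, quadratically many in the multivariate case), which is precisely the cost the paper reduces.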

Cite this Paper


BibTeX
@InProceedings{pmlr-v104-assaad19a,
  title     = {Scaling Causal Inference in Additive Noise Models},
  author    = {Assaad, Charles Karim and Devijver, Emilie and Gaussier, Eric and Ait-Bachir, Ali},
  booktitle = {Proceedings of Machine Learning Research},
  pages     = {22--33},
  year      = {2019},
  volume    = {104},
  series    = {Proceedings of Machine Learning Research},
  month     = {05 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v104/assaad19a/assaad19a.pdf},
  url       = {https://proceedings.mlr.press/v104/assaad19a.html}
}
APA
Assaad, C.K., Devijver, E., Gaussier, E., & Ait-Bachir, A. (2019). Scaling Causal Inference in Additive Noise Models. Proceedings of Machine Learning Research, 104:22-33. Available from https://proceedings.mlr.press/v104/assaad19a.html.