The Kernel Interaction Trick: Fast Bayesian Discovery of Pairwise Interactions in High Dimensions

Raj Agrawal, Brian Trippe, Jonathan Huggins, Tamara Broderick
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:141-150, 2019.

Abstract

Discovering interaction effects on a response of interest is a fundamental problem faced in biology, medicine, economics, and many other scientific disciplines. In theory, Bayesian methods for discovering pairwise interactions enjoy many benefits such as coherent uncertainty quantification, the ability to incorporate background knowledge, and desirable shrinkage properties. In practice, however, Bayesian methods are often computationally intractable for even moderate-dimensional problems. Our key insight is that many hierarchical models of practical interest admit a Gaussian process representation such that rather than maintaining a posterior over all O(p^2) interactions, we need only maintain a vector of O(p) kernel hyper-parameters. This implicit representation allows us to run Markov chain Monte Carlo (MCMC) over model hyper-parameters in time and memory linear in p per iteration. We focus on sparsity-inducing models and show on datasets with a variety of covariate behaviors that our method: (1) reduces runtime by orders of magnitude over naive applications of MCMC, (2) provides lower Type I and Type II error relative to state-of-the-art LASSO-based approaches, and (3) offers improved computational scaling in high dimensions relative to existing Bayesian and LASSO-based approaches.
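The O(p) representation rests on a standard kernel identity: a degree-2 polynomial kernel over covariates scaled by per-dimension hyper-parameters κ reproduces the inner product of the explicit O(p^2) pairwise-interaction feature map. The sketch below illustrates only this identity (it is not the paper's full model, which also handles main effects and sparsity priors); the scale vector `kappa` stands in for the kernel hyper-parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5, 4
X = rng.normal(size=(n, p))
kappa = rng.uniform(0.5, 2.0, size=p)  # O(p) kernel hyper-parameters

# Implicit view: degree-2 kernel on kappa-scaled covariates.
# Costs O(n^2 p) time and never materializes the p^2 interactions.
K_implicit = ((X * kappa) @ X.T) ** 2

# Explicit view: feature map over all O(p^2) pairwise products x_i x_j,
# weighted so that <phi(x), phi(z)> matches the kernel above.
def phi(x):
    return np.array([np.sqrt(kappa[i] * kappa[j]) * x[i] * x[j]
                     for i in range(p) for j in range(p)])

Phi = np.stack([phi(x) for x in X])
K_explicit = Phi @ Phi.T

# The two Gram matrices agree entry-by-entry.
assert np.allclose(K_implicit, K_explicit)
```

Because MCMC only ever needs the Gram matrix, updating κ (the O(p) object) is enough; the quadratic-in-p interaction coefficients stay implicit.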

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-agrawal19a,
  title     = {The Kernel Interaction Trick: Fast {B}ayesian Discovery of Pairwise Interactions in High Dimensions},
  author    = {Agrawal, Raj and Trippe, Brian and Huggins, Jonathan and Broderick, Tamara},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {141--150},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/agrawal19a/agrawal19a.pdf},
  url       = {https://proceedings.mlr.press/v97/agrawal19a.html},
  abstract  = {Discovering interaction effects on a response of interest is a fundamental problem faced in biology, medicine, economics, and many other scientific disciplines. In theory, Bayesian methods for discovering pairwise interactions enjoy many benefits such as coherent uncertainty quantification, the ability to incorporate background knowledge, and desirable shrinkage properties. In practice, however, Bayesian methods are often computationally intractable for even moderate-dimensional problems. Our key insight is that many hierarchical models of practical interest admit a Gaussian process representation such that rather than maintaining a posterior over all O(p^2) interactions, we need only maintain a vector of O(p) kernel hyper-parameters. This implicit representation allows us to run Markov chain Monte Carlo (MCMC) over model hyper-parameters in time and memory linear in p per iteration. We focus on sparsity-inducing models and show on datasets with a variety of covariate behaviors that our method: (1) reduces runtime by orders of magnitude over naive applications of MCMC, (2) provides lower Type I and Type II error relative to state-of-the-art LASSO-based approaches, and (3) offers improved computational scaling in high dimensions relative to existing Bayesian and LASSO-based approaches.}
}
Endnote
%0 Conference Paper
%T The Kernel Interaction Trick: Fast Bayesian Discovery of Pairwise Interactions in High Dimensions
%A Raj Agrawal
%A Brian Trippe
%A Jonathan Huggins
%A Tamara Broderick
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-agrawal19a
%I PMLR
%P 141--150
%U https://proceedings.mlr.press/v97/agrawal19a.html
%V 97
%X Discovering interaction effects on a response of interest is a fundamental problem faced in biology, medicine, economics, and many other scientific disciplines. In theory, Bayesian methods for discovering pairwise interactions enjoy many benefits such as coherent uncertainty quantification, the ability to incorporate background knowledge, and desirable shrinkage properties. In practice, however, Bayesian methods are often computationally intractable for even moderate-dimensional problems. Our key insight is that many hierarchical models of practical interest admit a Gaussian process representation such that rather than maintaining a posterior over all O(p^2) interactions, we need only maintain a vector of O(p) kernel hyper-parameters. This implicit representation allows us to run Markov chain Monte Carlo (MCMC) over model hyper-parameters in time and memory linear in p per iteration. We focus on sparsity-inducing models and show on datasets with a variety of covariate behaviors that our method: (1) reduces runtime by orders of magnitude over naive applications of MCMC, (2) provides lower Type I and Type II error relative to state-of-the-art LASSO-based approaches, and (3) offers improved computational scaling in high dimensions relative to existing Bayesian and LASSO-based approaches.
APA
Agrawal, R., Trippe, B., Huggins, J. &amp; Broderick, T. (2019). The Kernel Interaction Trick: Fast Bayesian Discovery of Pairwise Interactions in High Dimensions. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:141-150. Available from https://proceedings.mlr.press/v97/agrawal19a.html.