A Linear-time Independence Criterion Based on a Finite Basis Approximation
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:202-212, 2020.
Abstract
Detecting statistical dependence between random variables is an essential component of many machine learning algorithms. We propose a novel independence criterion for two random variables with linear-time complexity. We establish that our independence criterion is an upper bound of the Hirschfeld-Gebelein-Rényi maximum correlation coefficient between the tested variables. A finite set of basis functions is employed to approximate the mapping functions that achieve the maximal correlation. Using classic benchmark experiments based on independent component analysis, we demonstrate that our independence criterion performs comparably to state-of-the-art quadratic-time kernel dependence measures such as the Hilbert-Schmidt Independence Criterion, while being computationally more efficient. The experimental results also show that our independence criterion outperforms a contemporary linear-time kernel dependence measure, the Finite Set Independence Criterion. Finally, we experimentally validate a potential application of our criterion in deep neural networks.
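The abstract does not give the paper's exact construction, so the following is only a minimal sketch of the general idea it describes: restrict the mapping functions in the Hirschfeld-Gebelein-Rényi maximal correlation to the span of a finite basis, and score dependence by the largest canonical correlation between the two feature maps. The choice of random Fourier features as the basis, and the names `finite_basis_dependence` and `n_basis`, are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def finite_basis_dependence(x, y, n_basis=8, seed=0):
    """Sketch of a finite-basis dependence score (assumed form, not the
    paper's exact criterion).

    Each 1-D sample vector is mapped through a fixed set of basis
    functions (random Fourier features here, a hypothetical choice);
    the score is the largest canonical correlation between the two
    feature blocks, which upper-bounds the maximal correlation
    achievable within the spans of the two bases.
    """
    rng = np.random.default_rng(seed)
    n = len(x)

    def features(v):
        # Random Fourier features: cos/sin of random projections of the
        # standardized variable; each sample costs O(n_basis) to map.
        w = rng.standard_normal(n_basis)
        proj = np.outer((v - v.mean()) / (v.std() + 1e-12), w)
        f = np.concatenate([np.cos(proj), np.sin(proj)], axis=1)
        return f - f.mean(axis=0)  # center each basis response

    fx, gy = features(x), features(y)
    d = 2 * n_basis
    # Small d x d (cross-)covariance matrices; ridge term for stability.
    cx = fx.T @ fx / n + 1e-6 * np.eye(d)
    cy = gy.T @ gy / n + 1e-6 * np.eye(d)
    cxy = fx.T @ gy / n
    # Whiten both blocks via Cholesky factors; the top singular value of
    # Lx^{-1} Cxy Ly^{-T} is the largest canonical correlation.
    lx, ly = np.linalg.cholesky(cx), np.linalg.cholesky(cy)
    m = np.linalg.solve(lx, cxy)
    m = np.linalg.solve(ly, m.T).T
    return np.linalg.svd(m, compute_uv=False)[0]
```

For a fixed number of basis functions the cost is linear in the sample size, matching the complexity claim in the abstract: the feature maps are computed per sample, and all remaining operations act on small d x d matrices. Under independence the score should be near zero, while nonlinear relationships (e.g., y a noisy function of x) should drive it toward one.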