Enhancing Sufficient Dimension Reduction via Hellinger Correlation
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:18634-18647, 2024.
Abstract
In this work, we develop a new theory and method for sufficient dimension reduction (SDR) in single-index models, where SDR is a sub-field of supervised dimension reduction based on conditional independence. Our work is primarily motivated by the recent introduction of the Hellinger correlation as a dependency measure. Utilizing this measure, we propose a method that effectively detects the dimension reduction subspace, together with theoretical justification. Through extensive numerical experiments, we demonstrate that our proposed method significantly improves upon and outperforms existing SDR methods. This improvement is largely attributable to the method's deeper capture of data dependencies and to its refinement of existing SDR techniques.
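To illustrate the general idea described in the abstract, the sketch below estimates the index direction of a single-index model by maximizing a dependence measure between the response and a one-dimensional projection of the predictors. This is a minimal conceptual sketch, not the paper's algorithm: the `dependence` function uses absolute Pearson correlation as a stand-in for the Hellinger correlation, and the names (`beta_true`, `dependence`, `objective`) are illustrative assumptions rather than anything defined in the paper.

```python
# Conceptual sketch: estimate the single-index direction beta in
# y = g(x^T beta) + noise by maximizing a dependence measure between
# y and the projection X @ b over unit vectors b.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 500, 5

# Ground-truth index direction (illustrative) and simulated data.
beta_true = np.array([1.0, 0.5, 0.0, 0.0, 0.0])
beta_true /= np.linalg.norm(beta_true)
X = rng.standard_normal((n, p))
y = np.sin(X @ beta_true) + 0.1 * rng.standard_normal(n)

def dependence(y, t):
    # Placeholder dependence measure: absolute Pearson correlation.
    # The paper's method uses the Hellinger correlation instead.
    return abs(np.corrcoef(y, t)[0, 1])

def objective(b):
    b = b / np.linalg.norm(b)        # restrict to the unit sphere
    return -dependence(y, X @ b)     # maximize dependence

res = minimize(objective, x0=rng.standard_normal(p), method="Nelder-Mead")
beta_hat = res.x / np.linalg.norm(res.x)

# The index direction is identifiable only up to sign, so compare |<beta_hat, beta_true>|.
print(abs(beta_hat @ beta_true))
```

A value close to 1 in the final print indicates that the estimated direction aligns with the true index direction; swapping in a stronger dependence measure (such as the Hellinger correlation) is what allows more general, non-monotone links to be handled.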