Orthogonal Statistical Learning with Self-Concordant Loss
Proceedings of the Thirty-Fifth Conference on Learning Theory, PMLR 178:5253-5277, 2022.
Abstract
Orthogonal statistical learning and double machine learning have emerged as general frameworks for two-stage statistical prediction in the presence of a nuisance component. We establish non-asymptotic bounds on the excess risk of orthogonal statistical learning methods with a loss function satisfying a self-concordance property. Our bounds improve upon existing ones by a dimension factor while lifting the assumption of strong convexity. We illustrate the results with examples from multiple treatment effect estimation and generalized partially linear modeling.
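
For orientation, the two-stage formulation behind orthogonal statistical learning can be written out in a few display equations. The sketch below uses generic notation (loss \ell, target \theta, nuisance g, population risk L, sample Z_1, ..., Z_n) that is standard in this literature but not copied from the paper; it is illustrative background rather than the paper's exact setup.

% Stage 1: estimate the unknown nuisance g^* (e.g., by machine learning on an
% auxiliary sample), producing \hat{g}.
% Stage 2: plug \hat{g} into the empirical risk and minimize over the target class:
\[
  \hat{\theta} \in \operatorname*{arg\,min}_{\theta \in \Theta}
    \frac{1}{n} \sum_{i=1}^{n} \ell(\theta, \hat{g}; Z_i),
  \qquad
  L(\theta, g) := \mathbb{E}\bigl[\ell(\theta, g; Z)\bigr].
\]
% Neyman orthogonality: the cross derivative of the population risk vanishes at the
% truth (\theta^*, g^*), so first-order errors in \hat{g} do not propagate to \hat{\theta}:
\[
  D_g D_\theta L(\theta^*, g^*)[\theta - \theta^*, g - g^*] = 0
  \quad \text{for all admissible directions } \theta - \theta^*,\ g - g^*.
\]
% Self-concordance of a scalar convex loss \varphi controls the third derivative by
% the second; the classical Nesterov--Nemirovski form is shown here (pseudo
% self-concordant variants such as |\varphi'''| \le \varphi'', common for
% logistic-type losses, are in the same spirit):
\[
  \lvert \varphi'''(t) \rvert \le 2\, \varphi''(t)^{3/2}.
\]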