Orthogonal Statistical Learning with Self-Concordant Loss

Lang Liu, Carlos Cinelli, Zaid Harchaoui
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:5253-5277, 2022.

Abstract

Orthogonal statistical learning and double machine learning have emerged as general frameworks for two-stage statistical prediction in the presence of a nuisance component. We establish non-asymptotic bounds on the excess risk of orthogonal statistical learning methods with a loss function satisfying a self-concordance property. Our bounds improve upon existing bounds by a dimension factor while lifting the assumption of strong convexity. We illustrate the results with examples from multiple treatment effect estimation and generalized partially linear modeling.

Cite this Paper


BibTeX
@InProceedings{pmlr-v178-liu22g,
  title     = {Orthogonal Statistical Learning with Self-Concordant Loss},
  author    = {Liu, Lang and Cinelli, Carlos and Harchaoui, Zaid},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages     = {5253--5277},
  year      = {2022},
  editor    = {Loh, Po-Ling and Raginsky, Maxim},
  volume    = {178},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--05 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v178/liu22g/liu22g.pdf},
  url       = {https://proceedings.mlr.press/v178/liu22g.html},
  abstract  = {Orthogonal statistical learning and double machine learning have emerged as general frameworks for two-stage statistical prediction in the presence of a nuisance component. We establish non-asymptotic bounds on the excess risk of orthogonal statistical learning methods with a loss function satisfying a self-concordance property. Our bounds improve upon existing bounds by a dimension factor while lifting the assumption of strong convexity. We illustrate the results with examples from multiple treatment effect estimation and generalized partially linear modeling.}
}
Endnote
%0 Conference Paper
%T Orthogonal Statistical Learning with Self-Concordant Loss
%A Lang Liu
%A Carlos Cinelli
%A Zaid Harchaoui
%B Proceedings of Thirty Fifth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Po-Ling Loh
%E Maxim Raginsky
%F pmlr-v178-liu22g
%I PMLR
%P 5253--5277
%U https://proceedings.mlr.press/v178/liu22g.html
%V 178
%X Orthogonal statistical learning and double machine learning have emerged as general frameworks for two-stage statistical prediction in the presence of a nuisance component. We establish non-asymptotic bounds on the excess risk of orthogonal statistical learning methods with a loss function satisfying a self-concordance property. Our bounds improve upon existing bounds by a dimension factor while lifting the assumption of strong convexity. We illustrate the results with examples from multiple treatment effect estimation and generalized partially linear modeling.
APA
Liu, L., Cinelli, C., & Harchaoui, Z. (2022). Orthogonal Statistical Learning with Self-Concordant Loss. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:5253-5277. Available from https://proceedings.mlr.press/v178/liu22g.html.