Near-optimal max-affine estimators for convex regression

Gábor Balázs, András György, Csaba Szepesvári
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:56-64, 2015.

Abstract

This paper considers least squares estimators for regression problems over convex, uniformly bounded, uniformly Lipschitz function classes minimizing the empirical risk over max-affine functions (the maximum of finitely many affine functions). Based on new results on nonlinear nonparametric regression and on the approximation accuracy of max-affine functions, these estimators are proved to achieve the optimal rate of convergence up to logarithmic factors. Preliminary experiments indicate that a simple randomized approximation to the optimal estimator is competitive with state-of-the-art alternatives.
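The abstract's central object, a max-affine function f(x) = max_k (a_k·x + b_k), and the idea of fitting one by minimizing the empirical squared risk can be illustrated with a small sketch. The alternating partition-and-refit heuristic below is a common local-search approach to max-affine regression, not the paper's estimator, and the contiguous-chunk initialization is a 1-D convenience assumed here purely for illustration.

```python
import numpy as np

def max_affine(X, A, b):
    """Evaluate f(x) = max_k (a_k . x + b_k) at each row of X (shape (n, d))."""
    return (X @ A.T + b).max(axis=1)

def fit_max_affine(X, y, K=4, iters=20):
    """Illustrative alternating least-squares heuristic for max-affine
    regression (NOT the paper's estimator): partition the data, fit one
    affine piece per cell, reassign each point to its active piece, repeat."""
    n, d = X.shape
    A, b = np.zeros((K, d)), np.zeros(K)
    # 1-D convenience init: split the points into K contiguous chunks along x.
    idx = np.empty(n, dtype=int)
    idx[np.argsort(X[:, 0])] = np.arange(n) * K // n
    for _ in range(iters):
        for k in range(K):
            mask = idx == k
            if mask.sum() > d:  # refit piece k on its cell by least squares
                Z = np.hstack([X[mask], np.ones((mask.sum(), 1))])
                coef, *_ = np.linalg.lstsq(Z, y[mask], rcond=None)
                A[k], b[k] = coef[:-1], coef[-1]
        idx = (X @ A.T + b).argmax(axis=1)  # active piece per point
    return A, b

# Fit the convex function |x| from noisy samples.
rng = np.random.default_rng(1)
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.abs(X).ravel() + 0.01 * rng.normal(size=200)
A, b = fit_max_affine(X, y, K=4)
mse = np.mean((max_affine(X, A, b) - np.abs(X).ravel()) ** 2)
```

The paper's point is statistical: the least squares estimator over max-affine functions (with a suitable number of pieces) attains the minimax rate up to logarithmic factors, whereas a heuristic like the one above only finds a local optimum of the empirical risk.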

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-balazs15,
  title     = {{Near-optimal max-affine estimators for convex regression}},
  author    = {Gabor Balazs and András György and Csaba Szepesvari},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {56--64},
  year      = {2015},
  editor    = {Guy Lebanon and S. V. N. Vishwanathan},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/balazs15.pdf},
  url       = {http://proceedings.mlr.press/v38/balazs15.html},
  abstract  = {This paper considers least squares estimators for regression problems over convex, uniformly bounded, uniformly Lipschitz function classes minimizing the empirical risk over max-affine functions (the maximum of finitely many affine functions). Based on new results on nonlinear nonparametric regression and on the approximation accuracy of max-affine functions, these estimators are proved to achieve the optimal rate of convergence up to logarithmic factors. Preliminary experiments indicate that a simple randomized approximation to the optimal estimator is competitive with state-of-the-art alternatives.}
}
Endnote
%0 Conference Paper
%T Near-optimal max-affine estimators for convex regression
%A Gabor Balazs
%A András György
%A Csaba Szepesvari
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-balazs15
%I PMLR
%P 56--64
%U http://proceedings.mlr.press/v38/balazs15.html
%V 38
%X This paper considers least squares estimators for regression problems over convex, uniformly bounded, uniformly Lipschitz function classes minimizing the empirical risk over max-affine functions (the maximum of finitely many affine functions). Based on new results on nonlinear nonparametric regression and on the approximation accuracy of max-affine functions, these estimators are proved to achieve the optimal rate of convergence up to logarithmic factors. Preliminary experiments indicate that a simple randomized approximation to the optimal estimator is competitive with state-of-the-art alternatives.
RIS
TY  - CPAPER
TI  - Near-optimal max-affine estimators for convex regression
AU  - Gabor Balazs
AU  - András György
AU  - Csaba Szepesvari
BT  - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA  - 2015/02/21
ED  - Guy Lebanon
ED  - S. V. N. Vishwanathan
ID  - pmlr-v38-balazs15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 38
SP  - 56
EP  - 64
L1  - http://proceedings.mlr.press/v38/balazs15.pdf
UR  - http://proceedings.mlr.press/v38/balazs15.html
AB  - This paper considers least squares estimators for regression problems over convex, uniformly bounded, uniformly Lipschitz function classes minimizing the empirical risk over max-affine functions (the maximum of finitely many affine functions). Based on new results on nonlinear nonparametric regression and on the approximation accuracy of max-affine functions, these estimators are proved to achieve the optimal rate of convergence up to logarithmic factors. Preliminary experiments indicate that a simple randomized approximation to the optimal estimator is competitive with state-of-the-art alternatives.
ER  -
APA
Balazs, G., György, A. & Szepesvari, C. (2015). Near-optimal max-affine estimators for convex regression. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:56-64. Available from http://proceedings.mlr.press/v38/balazs15.html.