(Approximate) Subgradient Methods for Structured Prediction

Nathan D. Ratliff, J. Andrew Bagnell, Martin A. Zinkevich
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:380-387, 2007.

Abstract

Promising approaches to structured learning problems have recently been developed in the maximum margin framework. Unfortunately, algorithms that are computationally and memory efficient enough to solve large scale problems have lagged behind. We propose using simple subgradient-based techniques for optimizing a regularized risk formulation of these problems in both online and batch settings, and analyze the theoretical convergence, generalization, and robustness properties of the resulting techniques. These algorithms are simple, memory efficient, fast to converge, and have small regret in the online setting. We also investigate a novel convex regression formulation of structured learning. Finally, we demonstrate the benefits of the subgradient approach on three structured prediction problems.
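To make the core idea concrete: the regularized risk the abstract refers to is the structured hinge loss plus an L2 penalty, and its subgradient at each example combines the regularizer with the feature difference between the loss-augmented prediction and the true output. The sketch below is not from the paper; it is a generic illustration of that update on a toy multiclass problem standing in for a full structured output space (where the argmax would be a combinatorial inference step), and all function names and parameters are hypothetical.

```python
import numpy as np

def loss_augmented_argmax(w, x, y_true, n_classes):
    """Pick the output maximizing score plus 0/1 loss.
    In a true structured problem this is a combinatorial inference step."""
    scores = np.array([w[y] @ x + (y != y_true) for y in range(n_classes)])
    return int(np.argmax(scores))

def subgradient_step(w, x, y_true, lam, eta, n_classes):
    """One subgradient step on the regularized structured hinge loss."""
    y_star = loss_augmented_argmax(w, x, y_true, n_classes)
    g = lam * w.copy()   # subgradient of the (lam/2)||w||^2 regularizer
    g[y_star] += x       # feature vector of the loss-augmented prediction
    g[y_true] -= x       # minus feature vector of the true output
    return w - eta * g

# Toy demo: 3 well-separated classes in 2-d, visited online.
rng = np.random.default_rng(0)
centers = np.array([[3.0, 0.0], [0.0, 3.0], [-3.0, -3.0]])
y = rng.integers(0, 3, size=60)
X = centers[y] + rng.normal(scale=0.5, size=(60, 2))

w = np.zeros((3, 2))
for t in range(200):
    i = t % len(X)
    w = subgradient_step(w, X[i], y[i], lam=0.01,
                         eta=0.1 / np.sqrt(t + 1), n_classes=3)

preds = np.argmax(w @ X.T, axis=0)
acc = float(np.mean(preds == y))
```

The decaying step size eta_t proportional to 1/sqrt(t) mirrors the standard choice for online subgradient methods with sublinear regret; only the inference routine changes when moving from this toy setting to genuinely structured outputs.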

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-ratliff07a,
  title = {(Approximate) Subgradient Methods for Structured Prediction},
  author = {Ratliff, Nathan D. and Bagnell, J. Andrew and Zinkevich, Martin A.},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages = {380--387},
  year = {2007},
  editor = {Meila, Marina and Shen, Xiaotong},
  volume = {2},
  series = {Proceedings of Machine Learning Research},
  address = {San Juan, Puerto Rico},
  month = {21--24 Mar},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v2/ratliff07a/ratliff07a.pdf},
  url = {http://proceedings.mlr.press/v2/ratliff07a.html},
  abstract = {Promising approaches to structured learning problems have recently been developed in the maximum margin framework. Unfortunately, algorithms that are computationally and memory efficient enough to solve large scale problems have lagged behind. We propose using simple subgradient-based techniques for optimizing a regularized risk formulation of these problems in both online and batch settings, and analyze the theoretical convergence, generalization, and robustness properties of the resulting techniques. These algorithms are simple, memory efficient, fast to converge, and have small regret in the online setting. We also investigate a novel convex regression formulation of structured learning. Finally, we demonstrate the benefits of the subgradient approach on three structured prediction problems.}
}
Endnote
%0 Conference Paper
%T (Approximate) Subgradient Methods for Structured Prediction
%A Nathan D. Ratliff
%A J. Andrew Bagnell
%A Martin A. Zinkevich
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-ratliff07a
%I PMLR
%P 380--387
%U http://proceedings.mlr.press/v2/ratliff07a.html
%V 2
%X Promising approaches to structured learning problems have recently been developed in the maximum margin framework. Unfortunately, algorithms that are computationally and memory efficient enough to solve large scale problems have lagged behind. We propose using simple subgradient-based techniques for optimizing a regularized risk formulation of these problems in both online and batch settings, and analyze the theoretical convergence, generalization, and robustness properties of the resulting techniques. These algorithms are simple, memory efficient, fast to converge, and have small regret in the online setting. We also investigate a novel convex regression formulation of structured learning. Finally, we demonstrate the benefits of the subgradient approach on three structured prediction problems.
RIS
TY  - CPAPER
TI  - (Approximate) Subgradient Methods for Structured Prediction
AU  - Nathan D. Ratliff
AU  - J. Andrew Bagnell
AU  - Martin A. Zinkevich
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-ratliff07a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 2
SP  - 380
EP  - 387
L1  - http://proceedings.mlr.press/v2/ratliff07a/ratliff07a.pdf
UR  - http://proceedings.mlr.press/v2/ratliff07a.html
AB  - Promising approaches to structured learning problems have recently been developed in the maximum margin framework. Unfortunately, algorithms that are computationally and memory efficient enough to solve large scale problems have lagged behind. We propose using simple subgradient-based techniques for optimizing a regularized risk formulation of these problems in both online and batch settings, and analyze the theoretical convergence, generalization, and robustness properties of the resulting techniques. These algorithms are simple, memory efficient, fast to converge, and have small regret in the online setting. We also investigate a novel convex regression formulation of structured learning. Finally, we demonstrate the benefits of the subgradient approach on three structured prediction problems.
ER  - 
APA
Ratliff, N.D., Bagnell, J.A. & Zinkevich, M.A. (2007). (Approximate) Subgradient Methods for Structured Prediction. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 2:380-387. Available from http://proceedings.mlr.press/v2/ratliff07a.html.