Mark Schmidt, Ewout Berg, Michael Friedlander, Kevin Murphy;
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:456-463, 2009.
Abstract
An optimization algorithm for minimizing a smooth function over a convex set is described. Each iteration of the method computes a descent direction by minimizing, over the original constraints, a diagonal plus low-rank quadratic approximation to the function. The quadratic approximation is constructed using a limited-memory quasi-Newton update. The method is suitable for large-scale problems where evaluation of the function is substantially more expensive than projection onto the constraint set. Numerical experiments on one-norm regularized test problems indicate that the proposed method is competitive with state-of-the-art methods such as bound-constrained L-BFGS and orthant-wise descent. We further show that the method generalizes to a wide class of problems, and substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.
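The abstract describes the projected quasi-Newton iteration only at a high level. As an illustration (not the authors' implementation), the sketch below shows a simplified limited-memory projected quasi-Newton loop in Python: it computes an L-BFGS step with the standard two-loop recursion, projects the trial point back onto the feasible set, and backtracks along the resulting feasible direction, falling back to the projected gradient when the projected step fails to descend. All names (`pqn_sketch`, `two_loop`, `project`) and the example data are illustrative; the paper itself builds a diagonal-plus-low-rank quadratic model and minimizes it over the original constraints with a spectral projected-gradient subproblem solver, a step this sketch replaces with a simple projection of the quasi-Newton step.

```python
import numpy as np

def two_loop(g, S, Y):
    """L-BFGS two-loop recursion: approximates H @ g, where H is the
    limited-memory inverse-Hessian approximation built from the stored
    curvature pairs in S and Y."""
    q = g.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    gamma = S[-1].dot(Y[-1]) / Y[-1].dot(Y[-1]) if S else 1.0
    r = gamma * q
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        b = rho * y.dot(r)
        r += (a - b) * s
    return r

def pqn_sketch(f, grad, project, x0, m=10, max_iter=100, tol=1e-6):
    """Simplified projected quasi-Newton loop (illustrative sketch only).
    f, grad : costly objective and its gradient
    project : cheap Euclidean projection onto the convex feasible set
    """
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    S, Y = [], []
    for _ in range(max_iter):
        # Quasi-Newton step, projected back onto the feasible set; search
        # along the resulting feasible direction.
        d = project(x - two_loop(g, S, Y)) - x
        if np.linalg.norm(d) < tol:
            break
        gd = g.dot(d)
        if gd >= 0:                      # projected QN step need not descend;
            d = project(x - g) - x       # fall back to the projected gradient
            gd = g.dot(d)
        # Backtracking (Armijo) line search on the costly objective f.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * gd and t > 1e-10:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s.dot(y) > 1e-10:             # keep only positive-curvature pairs
            S.append(s); Y.append(y)
            if len(S) > m:
                S.pop(0); Y.pop(0)
        x, g = x_new, g_new
    return x

# Hypothetical usage: minimize 0.5*||A x - b||^2 subject to x >= 0.
# A = np.random.randn(20, 5); b = np.random.randn(20)
# x_star = pqn_sketch(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
#                     lambda x: A.T @ (A @ x - b),
#                     lambda z: np.maximum(z, 0.0),
#                     np.zeros(5))
```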
@InProceedings{pmlr-v5-schmidt09a,
title = {Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm},
author = {Mark Schmidt and Ewout Berg and Michael Friedlander and Kevin Murphy},
booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
pages = {456--463},
year = {2009},
editor = {David van Dyk and Max Welling},
volume = {5},
series = {Proceedings of Machine Learning Research},
address = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
month = {16--18 Apr},
publisher = {PMLR},
pdf = {http://proceedings.mlr.press/v5/schmidt09a/schmidt09a.pdf},
url = {http://proceedings.mlr.press/v5/schmidt09a.html},
abstract = {An optimization algorithm for minimizing a smooth function over a convex set is described. Each iteration of the method computes a descent direction by minimizing, over the original constraints, a diagonal plus low-rank quadratic approximation to the function. The quadratic approximation is constructed using a limited-memory quasi-Newton update. The method is suitable for large-scale problems where evaluation of the function is substantially more expensive than projection onto the constraint set. Numerical experiments on one-norm regularized test problems indicate that the proposed method is competitive with state-of-the-art methods such as bound-constrained L-BFGS and orthant-wise descent. We further show that the method generalizes to a wide class of problems, and substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.}
}
%0 Conference Paper
%T Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm
%A Mark Schmidt
%A Ewout Berg
%A Michael Friedlander
%A Kevin Murphy
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-schmidt09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 456--463
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X An optimization algorithm for minimizing a smooth function over a convex set is described. Each iteration of the method computes a descent direction by minimizing, over the original constraints, a diagonal plus low-rank quadratic approximation to the function. The quadratic approximation is constructed using a limited-memory quasi-Newton update. The method is suitable for large-scale problems where evaluation of the function is substantially more expensive than projection onto the constraint set. Numerical experiments on one-norm regularized test problems indicate that the proposed method is competitive with state-of-the-art methods such as bound-constrained L-BFGS and orthant-wise descent. We further show that the method generalizes to a wide class of problems, and substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.
TY - CPAPER
TI - Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm
AU - Mark Schmidt
AU - Ewout Berg
AU - Michael Friedlander
AU - Kevin Murphy
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY - 2009/04/15
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-schmidt09a
PB - PMLR
SP - 456
DP - PMLR
EP - 463
L1 - http://proceedings.mlr.press/v5/schmidt09a/schmidt09a.pdf
UR - http://proceedings.mlr.press/v5/schmidt09a.html
AB - An optimization algorithm for minimizing a smooth function over a convex set is described. Each iteration of the method computes a descent direction by minimizing, over the original constraints, a diagonal plus low-rank quadratic approximation to the function. The quadratic approximation is constructed using a limited-memory quasi-Newton update. The method is suitable for large-scale problems where evaluation of the function is substantially more expensive than projection onto the constraint set. Numerical experiments on one-norm regularized test problems indicate that the proposed method is competitive with state-of-the-art methods such as bound-constrained L-BFGS and orthant-wise descent. We further show that the method generalizes to a wide class of problems, and substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.
ER -
Schmidt, M., Berg, E., Friedlander, M. & Murphy, K. (2009). Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:456-463