Approximate Newton Methods and Their Local Convergence

Haishan Ye, Luo Luo, Zhihua Zhang
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3931-3939, 2017.

Abstract

Many machine learning models are formulated as optimization problems, so efficiently solving large-scale optimization problems is essential in big data applications. Subsampled Newton methods have recently attracted much attention because they retain the fast convergence rate of the ordinary Newton method while avoiding its high per-iteration cost. Other efficient stochastic second-order methods have also been proposed. However, the convergence properties of these methods are still not well understood, and several important gaps remain between the current convergence theory and the performance observed in real applications. In this paper, we aim to fill these gaps: we propose a unifying framework for analyzing the local convergence properties of second-order methods, and under this framework our theoretical analysis matches the empirical performance.
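To make the idea behind the abstract concrete, here is a minimal sketch of a subsampled Newton iteration, using regularized logistic regression as an illustrative objective (the loss, sample size, regularizer, and iteration count are assumptions for the example, not the paper's specific algorithm or constants). The gradient is computed exactly, but the Hessian is estimated on a random subsample S, so the per-iteration cost drops from roughly O(n d^2) to O(|S| d^2) while the Newton-like step is preserved.

# Sketch of a subsampled Newton method for L2-regularized logistic
# regression: exact gradient, Hessian estimated on a random subsample.
import numpy as np

def subsampled_newton(X, y, lam=1e-3, sample_size=256, iters=20, seed=0):
    """Minimize (1/n) sum_i log(1 + exp(-y_i x_i^T w)) + (lam/2)||w||^2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        z = X @ w
        p = 1.0 / (1.0 + np.exp(y * z))        # sigmoid(-y_i x_i^T w)
        grad = -(X.T @ (y * p)) / n + lam * w  # full (exact) gradient
        # Hessian estimated on a uniform subsample S with |S| << n.
        S = rng.choice(n, size=min(sample_size, n), replace=False)
        sS = 1.0 / (1.0 + np.exp(-(X[S] @ w)))
        D = sS * (1.0 - sS)                    # per-sample Hessian weights
        H = (X[S].T * D) @ X[S] / len(S) + lam * np.eye(d)
        w -= np.linalg.solve(H, grad)          # approximate Newton step
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((5000, 20))
    w_true = rng.standard_normal(20)
    y = np.sign(X @ w_true + 0.1 * rng.standard_normal(5000))
    w = subsampled_newton(X, y)
    print("train accuracy:", np.mean(np.sign(X @ w) == y))

In this setting the local convergence behavior hinges on how well the subsampled Hessian approximates the true one, which is exactly the kind of question the paper's unifying framework addresses.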

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-ye17a,
  title     = {Approximate {N}ewton Methods and Their Local Convergence},
  author    = {Haishan Ye and Luo Luo and Zhihua Zhang},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3931--3939},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/ye17a/ye17a.pdf},
  url       = {https://proceedings.mlr.press/v70/ye17a.html}
}
Endnote
%0 Conference Paper
%T Approximate Newton Methods and Their Local Convergence
%A Haishan Ye
%A Luo Luo
%A Zhihua Zhang
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-ye17a
%I PMLR
%P 3931--3939
%U https://proceedings.mlr.press/v70/ye17a.html
%V 70
APA
Ye, H., Luo, L. & Zhang, Z. (2017). Approximate Newton Methods and Their Local Convergence. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3931-3939. Available from https://proceedings.mlr.press/v70/ye17a.html.
