Skew-symmetrically perturbed gradient flow for convex optimization

Futoshi Futami, Tomoharu Iwata, Naonori Ueda, Ikko Yamane
Proceedings of The 13th Asian Conference on Machine Learning, PMLR 157:721-736, 2021.

Abstract

Recently, many methods for optimization and sampling have been developed by designing continuous dynamics and then discretizing them. The dynamics used for optimization each have an underlying functional that they minimize. For sampling, on the other hand, a wider class of dynamics has been studied, not necessarily limited to functional minimization. For example, dynamics perturbed with skew-symmetric matrices, which cannot be interpreted as the minimization of any functional, have been widely used to reduce asymptotic variance. Following this success in sampling, exploring such perturbed dynamics in the context of optimization can open a new avenue for optimization algorithm design. In this work, we introduce a perturbation technique from sampling into the optimization of strongly convex functions. We show that applying the perturbation to the gradient flow yields rapid convergence for strongly convex objectives. Based on these continuous dynamics, we propose an optimization algorithm for strongly convex functions with a novel discretization framework that combines the Euler method with the leapfrog method used in Hamiltonian Monte Carlo. Our numerical experiments show that the perturbation technique is useful for optimization.
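The construction described in the abstract is easy to illustrate. Below is a minimal sketch, not the authors' algorithm: a forward-Euler discretization of the perturbed gradient flow dx/dt = -(I + S)∇f(x) with a skew-symmetric S (S^T = -S), run on a strongly convex quadratic. The matrix S, its scale, the step size, and the test function are all illustrative assumptions; the paper's actual method further combines Euler steps with the leapfrog integrator from Hamiltonian Monte Carlo, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Strongly convex quadratic f(x) = 0.5 x^T A x, with A symmetric positive definite.
M = rng.standard_normal((d, d))
A = M @ M.T + d * np.eye(d)

def f(x):
    return 0.5 * x @ A @ x

def grad_f(x):
    return A @ x

# Skew-symmetric perturbation S = B - B^T, so S^T = -S. The 0.2 scale is an
# arbitrary illustrative choice that keeps the Euler discretization stable.
B = rng.standard_normal((d, d))
S = 0.2 * (B - B.T)

def euler_flow(x0, P, step=1e-3, n_iters=3000):
    """Forward-Euler discretization of dx/dt = -P grad f(x)."""
    x = x0.copy()
    for _ in range(n_iters):
        x = x - step * (P @ grad_f(x))
    return x

x0 = rng.standard_normal(d)
x_plain = euler_flow(x0, np.eye(d))      # unperturbed gradient flow
x_pert = euler_flow(x0, np.eye(d) + S)   # skew-symmetrically perturbed flow
print(f"f after plain flow:     {f(x_plain):.3e}")
print(f"f after perturbed flow: {f(x_pert):.3e}")
```

Note that along the continuous flow the skew part contributes nothing to the instantaneous decay of f, since ∇f(x)^T S ∇f(x) = 0 for any skew-symmetric S; the perturbation instead reshapes the trajectory toward the minimizer, which is what the paper argues can accelerate convergence.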

Cite this Paper


BibTeX
@InProceedings{pmlr-v157-futami21a,
  title = {Skew-symmetrically perturbed gradient flow for convex optimization},
  author = {Futami, Futoshi and Iwata, Tomoharu and Ueda, Naonori and Yamane, Ikko},
  booktitle = {Proceedings of The 13th Asian Conference on Machine Learning},
  pages = {721--736},
  year = {2021},
  editor = {Balasubramanian, Vineeth N. and Tsang, Ivor},
  volume = {157},
  series = {Proceedings of Machine Learning Research},
  month = {17--19 Nov},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v157/futami21a/futami21a.pdf},
  url = {https://proceedings.mlr.press/v157/futami21a.html},
  abstract = {Recently, many methods for optimization and sampling have been developed by designing continuous dynamics and then discretizing them. The dynamics used for optimization each have an underlying functional that they minimize. For sampling, on the other hand, a wider class of dynamics has been studied, not necessarily limited to functional minimization. For example, dynamics perturbed with skew-symmetric matrices, which cannot be interpreted as the minimization of any functional, have been widely used to reduce asymptotic variance. Following this success in sampling, exploring such perturbed dynamics in the context of optimization can open a new avenue for optimization algorithm design. In this work, we introduce a perturbation technique from sampling into the optimization of strongly convex functions. We show that applying the perturbation to the gradient flow yields rapid convergence for strongly convex objectives. Based on these continuous dynamics, we propose an optimization algorithm for strongly convex functions with a novel discretization framework that combines the Euler method with the leapfrog method used in Hamiltonian Monte Carlo. Our numerical experiments show that the perturbation technique is useful for optimization.}
}
Endnote
%0 Conference Paper
%T Skew-symmetrically perturbed gradient flow for convex optimization
%A Futoshi Futami
%A Tomoharu Iwata
%A Naonori Ueda
%A Ikko Yamane
%B Proceedings of The 13th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Vineeth N. Balasubramanian
%E Ivor Tsang
%F pmlr-v157-futami21a
%I PMLR
%P 721--736
%U https://proceedings.mlr.press/v157/futami21a.html
%V 157
%X Recently, many methods for optimization and sampling have been developed by designing continuous dynamics and then discretizing them. The dynamics used for optimization each have an underlying functional that they minimize. For sampling, on the other hand, a wider class of dynamics has been studied, not necessarily limited to functional minimization. For example, dynamics perturbed with skew-symmetric matrices, which cannot be interpreted as the minimization of any functional, have been widely used to reduce asymptotic variance. Following this success in sampling, exploring such perturbed dynamics in the context of optimization can open a new avenue for optimization algorithm design. In this work, we introduce a perturbation technique from sampling into the optimization of strongly convex functions. We show that applying the perturbation to the gradient flow yields rapid convergence for strongly convex objectives. Based on these continuous dynamics, we propose an optimization algorithm for strongly convex functions with a novel discretization framework that combines the Euler method with the leapfrog method used in Hamiltonian Monte Carlo. Our numerical experiments show that the perturbation technique is useful for optimization.
APA
Futami, F., Iwata, T., Ueda, N. & Yamane, I. (2021). Skew-symmetrically perturbed gradient flow for convex optimization. Proceedings of The 13th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 157:721-736. Available from https://proceedings.mlr.press/v157/futami21a.html.