Alternating Local Enumeration (TnALE): Solving Tensor Network Structure Search with Fewer Evaluations

Chao Li, Junhua Zeng, Chunmei Li, Cesar F Caiafa, Qibin Zhao
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:20384-20411, 2023.

Abstract

Tensor networks (TNs) are a powerful framework in machine learning, but selecting a good TN model, known as TN structure search (TN-SS), is a challenging and computationally intensive task. The recent approach TNLS (Li et al., 2022) showed promising results for this task, but its computational cost remains prohibitive, requiring too many evaluations of the objective function. We propose TnALE, a surprisingly simple algorithm that updates each structure-related variable alternately by local enumeration, greatly reducing the number of evaluations compared to TNLS. We theoretically investigate the descent steps of TNLS and TnALE, proving that both algorithms achieve linear convergence up to a constant if a sufficient reduction of the objective is reached in each neighborhood. We further compare the evaluation efficiency of TNLS and TnALE, showing that $\Omega(2^K)$ evaluations are typically required in TNLS to reach the objective reduction, whereas ideally $O(KR)$ evaluations suffice in TnALE, where $K$ denotes the dimension of the search space and $R$ reflects the “low-rankness” of the neighborhood. Experimental results verify that TnALE finds practically good TN structures with vastly fewer evaluations than state-of-the-art algorithms.
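The core loop the abstract describes, alternating over the structure variables and enumerating only a small neighborhood of each, can be illustrated with a short sketch. The Python below is a minimal illustration under assumptions of our own: the objective f, the integer encoding of the TN structure, and the neighborhood radius are hypothetical stand-ins, not the paper's exact formulation. It is meant only to show why one sweep costs on the order of K times the neighborhood size, rather than the $2^K$ candidates a full joint neighborhood enumeration would require.

# Minimal sketch of alternating local enumeration over a discrete
# search space. The objective `f`, the integer encoding of the TN
# structure, and the neighborhood radius are illustrative
# assumptions, not the paper's exact formulation.

def alternating_local_enumeration(f, x0, radius=1, max_sweeps=50):
    """Alternately update each of the K variables by enumerating its
    local neighborhood; each sweep costs O(K * radius) evaluations of
    f, instead of the 2^K candidates of a full joint neighborhood."""
    x = list(x0)
    best = f(x)
    for _ in range(max_sweeps):
        improved = False
        for k in range(len(x)):                # alternate over the K variables
            base = x[k]
            best_cand, best_val = base, best
            for cand in range(base - radius, base + radius + 1):
                if cand < 1 or cand == base:   # e.g., TN ranks must be >= 1
                    continue
                trial = x.copy()
                trial[k] = cand
                val = f(trial)                 # one evaluation per candidate
                if val < best_val:
                    best_cand, best_val = cand, val
            if best_cand != base:              # keep the best local candidate
                x[k], best, improved = best_cand, best_val, True
        if not improved:                       # no variable improved: stop
            break
    return x, best

# Toy usage: minimize a separable quadratic over integer "ranks".
if __name__ == "__main__":
    f = lambda x: sum((xi - 3) ** 2 for xi in x)
    print(alternating_local_enumeration(f, [1, 1, 1, 1]))  # ([3, 3, 3, 3], 0)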

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-li23ar,
  title     = {Alternating Local Enumeration ({T}n{ALE}): Solving Tensor Network Structure Search with Fewer Evaluations},
  author    = {Li, Chao and Zeng, Junhua and Li, Chunmei and Caiafa, Cesar F and Zhao, Qibin},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {20384--20411},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/li23ar/li23ar.pdf},
  url       = {https://proceedings.mlr.press/v202/li23ar.html}
}
Endnote
%0 Conference Paper
%T Alternating Local Enumeration (TnALE): Solving Tensor Network Structure Search with Fewer Evaluations
%A Chao Li
%A Junhua Zeng
%A Chunmei Li
%A Cesar F Caiafa
%A Qibin Zhao
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-li23ar
%I PMLR
%P 20384--20411
%U https://proceedings.mlr.press/v202/li23ar.html
%V 202
APA
Li, C., Zeng, J., Li, C., Caiafa, C.F. & Zhao, Q. (2023). Alternating Local Enumeration (TnALE): Solving Tensor Network Structure Search with Fewer Evaluations. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:20384-20411. Available from https://proceedings.mlr.press/v202/li23ar.html.
