Lifelong Learning with Sketched Structural Regularization

Haoran Li, Aditya Krishnan, Jingfeng Wu, Soheil Kolouri, Praveen K. Pilly, Vladimir Braverman
Proceedings of The 13th Asian Conference on Machine Learning, PMLR 157:985-1000, 2021.

Abstract

Preventing catastrophic forgetting while continually learning new tasks is an essential problem in lifelong learning. Structural regularization (SR) refers to a family of algorithms that mitigate catastrophic forgetting by penalizing the network for changing its “critical parameters” from previous tasks while learning a new one. The penalty is often induced via a quadratic regularizer defined by an \emph{importance matrix}, e.g., the (empirical) Fisher information matrix in the Elastic Weight Consolidation framework. In practice, and due to computational constraints, most SR methods crudely approximate the importance matrix by its diagonal. In this paper, we propose \emph{Sketched Structural Regularization} (Sketched SR) as an alternative approach to compressing the importance matrices used for regularization in SR methods. Specifically, we apply \emph{linear sketching methods} to better approximate the importance matrices in SR algorithms. We show that Sketched SR: (i) is computationally efficient and straightforward to implement, (ii) provides an approximation error that is justified in theory, and (iii) is method-oblivious by construction and can be adapted to any method in the SR class. We show that our proposed approach consistently improves the performance of various SR algorithms on both synthetic experiments and benchmark continual learning tasks, including permuted-MNIST and CIFAR-100.
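To make the idea concrete, the following is a minimal NumPy sketch (not the authors' implementation; the sketch size, Gaussian sketch matrix, and toy gradients are illustrative assumptions). For an empirical Fisher F = GᵀG/n built from per-example gradients G, the SR penalty (θ−θ*)ᵀF(θ−θ*) equals ‖G(θ−θ*)‖²/n, so storing a sketched matrix SG (k×d) instead of the full d×d importance matrix preserves the penalty approximately, while the common diagonal approximation discards all off-diagonal structure.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 50, 100  # examples, parameters, sketch size (illustrative)

# Per-example gradients at the previous task's optimum theta_star;
# the empirical Fisher is F = G.T @ G / n  (d x d).
G = rng.standard_normal((n, d))
F = G.T @ G / n

# Dense Gaussian sketch S (k x n): keep only SG (k x d), not F (d x d).
S = rng.standard_normal((k, n)) / np.sqrt(k)
SG = S @ G

theta_star = rng.standard_normal(d)
theta = theta_star + 0.1 * rng.standard_normal(d)
delta = theta - theta_star

exact    = delta @ F @ delta                # full quadratic SR penalty
sketched = np.sum((SG @ delta) ** 2) / n    # sketched approximation
diagonal = np.sum(np.diag(F) * delta ** 2)  # common diagonal approximation

print(f"exact penalty:    {exact:.4f}")
print(f"sketched penalty: {sketched:.4f}")
print(f"diagonal penalty: {diagonal:.4f}")
```

Because the sketch is linear, ‖SG δ‖² concentrates around ‖G δ‖² for any fixed δ, which is the sense in which the approximation error is controlled; a CountSketch or other linear sketch could replace the Gaussian matrix with the same storage benefit.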

Cite this Paper


BibTeX
@InProceedings{pmlr-v157-li21b,
  title     = {Lifelong Learning with Sketched Structural Regularization},
  author    = {Li, Haoran and Krishnan, Aditya and Wu, Jingfeng and Kolouri, Soheil and Pilly, Praveen K. and Braverman, Vladimir},
  booktitle = {Proceedings of The 13th Asian Conference on Machine Learning},
  pages     = {985--1000},
  year      = {2021},
  editor    = {Balasubramanian, Vineeth N. and Tsang, Ivor},
  volume    = {157},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v157/li21b/li21b.pdf},
  url       = {https://proceedings.mlr.press/v157/li21b.html},
  abstract  = {Preventing catastrophic forgetting while continually learning new tasks is an essential problem in lifelong learning. Structural regularization (SR) refers to a family of algorithms that mitigate catastrophic forgetting by penalizing the network for changing its “critical parameters” from previous tasks while learning a new one. The penalty is often induced via a quadratic regularizer defined by an \emph{importance matrix}, e.g., the (empirical) Fisher information matrix in the Elastic Weight Consolidation framework. In practice and due to computational constraints, most SR methods crudely approximate the importance matrix by its diagonal. In this paper, we propose \emph{Sketched Structural Regularization} (Sketched SR) as an alternative approach to compress the importance matrices used for regularizing in SR methods. Specifically, we apply \emph{linear sketching methods} to better approximate the importance matrices in SR algorithms. We show that sketched SR: (i) is computationally efficient and straightforward to implement, (ii) provides an approximation error that is justified in theory, and (iii) is method oblivious by construction and can be adapted to any method that belongs to the SR class. We show that our proposed approach consistently improves various SR algorithms’ performance on both synthetic experiments and benchmark continual learning tasks, including permuted-MNIST and CIFAR-100.}
}
Endnote
%0 Conference Paper
%T Lifelong Learning with Sketched Structural Regularization
%A Haoran Li
%A Aditya Krishnan
%A Jingfeng Wu
%A Soheil Kolouri
%A Praveen K. Pilly
%A Vladimir Braverman
%B Proceedings of The 13th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Vineeth N. Balasubramanian
%E Ivor Tsang
%F pmlr-v157-li21b
%I PMLR
%P 985--1000
%U https://proceedings.mlr.press/v157/li21b.html
%V 157
%X Preventing catastrophic forgetting while continually learning new tasks is an essential problem in lifelong learning. Structural regularization (SR) refers to a family of algorithms that mitigate catastrophic forgetting by penalizing the network for changing its “critical parameters” from previous tasks while learning a new one. The penalty is often induced via a quadratic regularizer defined by an \emph{importance matrix}, e.g., the (empirical) Fisher information matrix in the Elastic Weight Consolidation framework. In practice and due to computational constraints, most SR methods crudely approximate the importance matrix by its diagonal. In this paper, we propose \emph{Sketched Structural Regularization} (Sketched SR) as an alternative approach to compress the importance matrices used for regularizing in SR methods. Specifically, we apply \emph{linear sketching methods} to better approximate the importance matrices in SR algorithms. We show that sketched SR: (i) is computationally efficient and straightforward to implement, (ii) provides an approximation error that is justified in theory, and (iii) is method oblivious by construction and can be adapted to any method that belongs to the SR class. We show that our proposed approach consistently improves various SR algorithms’ performance on both synthetic experiments and benchmark continual learning tasks, including permuted-MNIST and CIFAR-100.
APA
Li, H., Krishnan, A., Wu, J., Kolouri, S., Pilly, P.K. & Braverman, V. (2021). Lifelong Learning with Sketched Structural Regularization. Proceedings of The 13th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 157:985-1000. Available from https://proceedings.mlr.press/v157/li21b.html.