Continual Learning with Guarantees via Weight Interval Constraints

Maciej Wołczyk, Karol Piczak, Bartosz Wójcik, Lukasz Pustelnik, Paweł Morawiecki, Jacek Tabor, Tomasz Trzcinski, Przemysław Spurek
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:23897-23911, 2022.

Abstract

We introduce a new training paradigm that enforces interval constraints on the neural network parameter space to control forgetting. Contemporary Continual Learning (CL) methods focus on training neural networks efficiently from a stream of data, while reducing the negative impact of catastrophic forgetting, yet they do not provide any firm guarantees that network performance will not deteriorate uncontrollably over time. In this work, we show how to put bounds on forgetting by reformulating continual learning of a model as a continual contraction of its parameter space. To that end, we propose Hyperrectangle Training, a new training methodology where each task is represented by a hyperrectangle in the parameter space, fully contained in the hyperrectangles of the previous tasks. This formulation reduces the NP-hard CL problem back to polynomial time while providing full resilience against forgetting. We validate our claim by developing the InterContiNet (Interval Continual Learning) algorithm, which leverages interval arithmetic to effectively model parameter regions as hyperrectangles. Through experimental results, we show that our approach performs well in a continual learning setup without storing data from previous tasks.
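To make the abstract's mechanism concrete, below is a minimal NumPy sketch of the two ingredients it describes: interval arithmetic that propagates a hyperrectangle of weights through a linear layer to bound its outputs, and a contraction step that keeps each task's weight hyperrectangle nested inside the previous one. This is an illustration under assumed shapes and function names (interval_linear, contract_into are made up here), not the authors' released implementation.

import numpy as np

def interval_linear(x, w_lo, w_hi, b_lo, b_hi):
    # Bound y = W @ x + b over all W in [w_lo, w_hi] and b in [b_lo, b_hi].
    # Interval arithmetic: a positive input coordinate is minimized by the
    # lower weight bound and maximized by the upper one; a negative input
    # coordinate is the reverse.
    x_pos = np.maximum(x, 0.0)
    x_neg = np.minimum(x, 0.0)
    y_lo = w_lo @ x_pos + w_hi @ x_neg + b_lo
    y_hi = w_hi @ x_pos + w_lo @ x_neg + b_hi
    return y_lo, y_hi

def contract_into(prev_lo, prev_hi, new_lo, new_hi):
    # Clip the current task's candidate hyperrectangle so it stays fully
    # contained in the previous task's hyperrectangle (the nesting the
    # abstract describes), repairing any interval that would become empty.
    lo = np.clip(new_lo, prev_lo, prev_hi)
    hi = np.clip(new_hi, prev_lo, prev_hi)
    return lo, np.maximum(hi, lo)

# Toy usage with made-up sizes: a box of half-width 0.1 around one weight matrix.
rng = np.random.default_rng(0)
w, eps = rng.normal(size=(2, 3)), 0.1
y_lo, y_hi = interval_linear(rng.normal(size=3), w - eps, w + eps,
                             -eps * np.ones(2), eps * np.ones(2))
assert np.all(y_lo <= y_hi)

Because each task's region is contained in all earlier ones, any single weight vector chosen from the final hyperrectangle still lies in every earlier region, which is, informally, where the resilience against forgetting comes from.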

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-wolczyk22a,
  title     = {Continual Learning with Guarantees via Weight Interval Constraints},
  author    = {Wo{\l}czyk, Maciej and Piczak, Karol and W{\'o}jcik, Bartosz and Pustelnik, Lukasz and Morawiecki, Pawe{\l} and Tabor, Jacek and Trzcinski, Tomasz and Spurek, Przemys{\l}aw},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {23897--23911},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/wolczyk22a/wolczyk22a.pdf},
  url       = {https://proceedings.mlr.press/v162/wolczyk22a.html}
}
Endnote
%0 Conference Paper
%T Continual Learning with Guarantees via Weight Interval Constraints
%A Maciej Wołczyk
%A Karol Piczak
%A Bartosz Wójcik
%A Lukasz Pustelnik
%A Paweł Morawiecki
%A Jacek Tabor
%A Tomasz Trzcinski
%A Przemysław Spurek
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-wolczyk22a
%I PMLR
%P 23897--23911
%U https://proceedings.mlr.press/v162/wolczyk22a.html
%V 162
APA
Wołczyk, M., Piczak, K., Wójcik, B., Pustelnik, L., Morawiecki, P., Tabor, J., Trzcinski, T. & Spurek, P. (2022). Continual Learning with Guarantees via Weight Interval Constraints. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:23897-23911. Available from https://proceedings.mlr.press/v162/wolczyk22a.html.
