Overcoming Catastrophic Forgetting by Bayesian Generative Regularization

Pei-Hung Chen, Wei Wei, Cho-Jui Hsieh, Bo Dai
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:1760-1770, 2021.

Abstract

In this paper, we propose a new method to overcome catastrophic forgetting by adding generative regularization to the Bayesian inference framework. Bayesian methods provide a general framework for continual learning. We further construct a generative regularization term for any given classification model by leveraging energy-based models and Langevin dynamics sampling to enrich the features learned in each task. By combining the discriminative and generative losses, we empirically show that the proposed method outperforms state-of-the-art methods on a variety of tasks, avoiding catastrophic forgetting in continual learning. In particular, the proposed method outperforms baseline methods by over 15% on the Fashion-MNIST dataset and 10% on the CUB dataset.
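
At a high level, the abstract describes training a classifier with a combined objective of the form L = L_disc + λ·L_gen, where the generative term comes from an energy-based view of the classifier and its negative samples are drawn with Langevin dynamics. The following is a minimal PyTorch sketch of that general recipe, not the paper's exact formulation: the logit-based energy definition, the Langevin hyperparameters, and the weight `gen_weight` are all illustrative assumptions.

```python
# Hypothetical sketch: a discriminative loss plus an energy-based generative
# regularizer, with negative samples drawn via Langevin dynamics. All names
# and hyperparameters are illustrative; `model` is any classifier that maps
# inputs to logits.
import torch
import torch.nn.functional as F

def energy(model, x):
    # One common EBM view of a classifier: E(x) = -logsumexp_y f(x)[y],
    # so that p(x) ∝ exp(-E(x)). This choice is an assumption here.
    return -torch.logsumexp(model(x), dim=-1)

def langevin_sample(model, x_init, steps=20, step_size=1.0, noise_std=0.01):
    # Approximate samples from p(x) ∝ exp(-E(x)) by noisy gradient descent
    # on the energy (Langevin dynamics).
    x = x_init.clone().detach().requires_grad_(True)
    for _ in range(steps):
        grad = torch.autograd.grad(energy(model, x).sum(), x)[0]
        x = (x - 0.5 * step_size * grad
             + noise_std * torch.randn_like(x)).detach().requires_grad_(True)
    return x.detach()

def loss_fn(model, x, y, gen_weight=0.1):
    # Discriminative term: standard cross-entropy on the current task.
    ce = F.cross_entropy(model(x), y)
    # Generative term: contrastive estimate of -log p(x); push energy down
    # on real data and up on Langevin negatives started from noise.
    x_neg = langevin_sample(model, torch.randn_like(x))
    gen = energy(model, x).mean() - energy(model, x_neg).mean()
    return ce + gen_weight * gen
```

In a continual-learning loop this loss would additionally be combined with the Bayesian posterior regularization over parameters learned on previous tasks; that component is omitted from the sketch.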

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-chen21v,
  title     = {Overcoming Catastrophic Forgetting by Bayesian Generative Regularization},
  author    = {Chen, Pei-Hung and Wei, Wei and Hsieh, Cho-Jui and Dai, Bo},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {1760--1770},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/chen21v/chen21v.pdf},
  url       = {https://proceedings.mlr.press/v139/chen21v.html},
  abstract  = {In this paper, we propose a new method to overcome catastrophic forgetting by adding generative regularization to the Bayesian inference framework. Bayesian methods provide a general framework for continual learning. We further construct a generative regularization term for any given classification model by leveraging energy-based models and Langevin dynamics sampling to enrich the features learned in each task. By combining the discriminative and generative losses, we empirically show that the proposed method outperforms state-of-the-art methods on a variety of tasks, avoiding catastrophic forgetting in continual learning. In particular, the proposed method outperforms baseline methods by over 15% on the Fashion-MNIST dataset and 10% on the CUB dataset.}
}
Endnote
%0 Conference Paper
%T Overcoming Catastrophic Forgetting by Bayesian Generative Regularization
%A Pei-Hung Chen
%A Wei Wei
%A Cho-Jui Hsieh
%A Bo Dai
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-chen21v
%I PMLR
%P 1760--1770
%U https://proceedings.mlr.press/v139/chen21v.html
%V 139
%X In this paper, we propose a new method to overcome catastrophic forgetting by adding generative regularization to the Bayesian inference framework. Bayesian methods provide a general framework for continual learning. We further construct a generative regularization term for any given classification model by leveraging energy-based models and Langevin dynamics sampling to enrich the features learned in each task. By combining the discriminative and generative losses, we empirically show that the proposed method outperforms state-of-the-art methods on a variety of tasks, avoiding catastrophic forgetting in continual learning. In particular, the proposed method outperforms baseline methods by over 15% on the Fashion-MNIST dataset and 10% on the CUB dataset.
APA
Chen, P., Wei, W., Hsieh, C. & Dai, B. (2021). Overcoming Catastrophic Forgetting by Bayesian Generative Regularization. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:1760-1770. Available from https://proceedings.mlr.press/v139/chen21v.html.
