Overcoming Multi-model Forgetting

Yassine Benyahia, Kaicheng Yu, Kamil Bennani Smires, Martin Jaggi, Anthony C. Davison, Mathieu Salzmann, Claudiu Musat
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:594-603, 2019.

Abstract

We identify a phenomenon, which we refer to as multi-model forgetting, that occurs when sequentially training multiple deep networks with partially-shared parameters; the performance of previously-trained models degrades as one optimizes a subsequent one, due to the overwriting of shared parameters. To overcome this, we introduce a statistically-justified weight plasticity loss that regularizes the learning of a model’s shared parameters according to their importance for the previous models, and demonstrate its effectiveness when training two models sequentially and for neural architecture search. Adding weight plasticity in neural architecture search preserves the best models to the end of the search and yields improved results in both natural language processing and computer vision tasks.
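The abstract describes the weight plasticity loss only at a high level: a regularizer that penalizes changes to shared parameters in proportion to their importance for previously trained models. Below is a minimal, hedged sketch of such an importance-weighted quadratic penalty in PyTorch. The function name, the dictionary-based interface, and the use of Fisher-information-style importance estimates are illustrative assumptions for exposition, not the authors' actual implementation.

```python
import torch

def weight_plasticity_penalty(shared_params, anchor_params, importance, strength=1.0):
    """Quadratic penalty discouraging shared parameters from drifting away from
    the values they held after training the previous model.

    shared_params: dict name -> current parameter tensor (of the model being trained)
    anchor_params: dict name -> frozen tensor snapshot from the previous model
    importance:    dict name -> per-parameter importance weights
                   (e.g. a Fisher-information estimate; an assumption here)
    strength:      scalar trade-off between the task loss and the penalty
    """
    penalty = torch.zeros(())
    for name, p in shared_params.items():
        penalty = penalty + (importance[name] * (p - anchor_params[name]).pow(2)).sum()
    return 0.5 * strength * penalty

# Hypothetical usage while training a second model that shares weights with the first:
# loss = task_loss + weight_plasticity_penalty(
#     shared_params={n: p for n, p in model_b.named_parameters() if n in shared_names},
#     anchor_params=snapshot_of_model_a_shared_weights,
#     importance=importance_estimates,
# )
# loss.backward()
```

The design mirrors Elastic Weight Consolidation-style regularization: important shared weights are anchored near their previous values, while unimportant ones remain free to adapt to the new model.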

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-benyahia19a,
  title     = {Overcoming Multi-model Forgetting},
  author    = {Benyahia, Yassine and Yu, Kaicheng and Smires, Kamil Bennani and Jaggi, Martin and Davison, Anthony C. and Salzmann, Mathieu and Musat, Claudiu},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {594--603},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/benyahia19a/benyahia19a.pdf},
  url       = {https://proceedings.mlr.press/v97/benyahia19a.html},
  abstract  = {We identify a phenomenon, which we refer to as multi-model forgetting, that occurs when sequentially training multiple deep networks with partially-shared parameters; the performance of previously-trained models degrades as one optimizes a subsequent one, due to the overwriting of shared parameters. To overcome this, we introduce a statistically-justified weight plasticity loss that regularizes the learning of a model’s shared parameters according to their importance for the previous models, and demonstrate its effectiveness when training two models sequentially and for neural architecture search. Adding weight plasticity in neural architecture search preserves the best models to the end of the search and yields improved results in both natural language processing and computer vision tasks.}
}
Endnote
%0 Conference Paper
%T Overcoming Multi-model Forgetting
%A Yassine Benyahia
%A Kaicheng Yu
%A Kamil Bennani Smires
%A Martin Jaggi
%A Anthony C. Davison
%A Mathieu Salzmann
%A Claudiu Musat
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-benyahia19a
%I PMLR
%P 594--603
%U https://proceedings.mlr.press/v97/benyahia19a.html
%V 97
%X We identify a phenomenon, which we refer to as multi-model forgetting, that occurs when sequentially training multiple deep networks with partially-shared parameters; the performance of previously-trained models degrades as one optimizes a subsequent one, due to the overwriting of shared parameters. To overcome this, we introduce a statistically-justified weight plasticity loss that regularizes the learning of a model’s shared parameters according to their importance for the previous models, and demonstrate its effectiveness when training two models sequentially and for neural architecture search. Adding weight plasticity in neural architecture search preserves the best models to the end of the search and yields improved results in both natural language processing and computer vision tasks.
APA
Benyahia, Y., Yu, K., Smires, K.B., Jaggi, M., Davison, A.C., Salzmann, M. & Musat, C. (2019). Overcoming Multi-model Forgetting. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:594-603. Available from https://proceedings.mlr.press/v97/benyahia19a.html.