Continual Learning and Private Unlearning

Bo Liu, Qiang Liu, Peter Stone
Proceedings of The 1st Conference on Lifelong Learning Agents, PMLR 199:243-254, 2022.

Abstract

As intelligent agents become autonomous over longer periods of time, they may eventually become lifelong counterparts to specific people. If so, it may be common for a user to want the agent to master a task temporarily but later on to forget the task due to privacy concerns. However, enabling an agent to forget privately what the user specified, without degrading the rest of the learned knowledge, is a challenging problem. With the aim of addressing this challenge, this paper formalizes the continual learning and private unlearning (CLPU) problem. The paper further introduces a straightforward but exactly private solution, CLPU-DER++, as the first step towards solving the CLPU problem, along with a set of carefully designed benchmark problems to evaluate the effectiveness of the proposed solution.

Cite this Paper


BibTeX
@InProceedings{pmlr-v199-liu22a,
  title     = {Continual Learning and Private Unlearning},
  author    = {Liu, Bo and Liu, Qiang and Stone, Peter},
  booktitle = {Proceedings of The 1st Conference on Lifelong Learning Agents},
  pages     = {243--254},
  year      = {2022},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Precup, Doina},
  volume    = {199},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--24 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v199/liu22a/liu22a.pdf},
  url       = {https://proceedings.mlr.press/v199/liu22a.html},
  abstract  = {As intelligent agents become autonomous over longer periods of time, they may eventually become lifelong counterparts to specific people. If so, it may be common for a user to want the agent to master a task temporarily but later on to forget the task due to privacy concerns. However enabling an agent to forget privately what the user specified without degrading the rest of the learned knowledge is a challenging problem. With the aim of addressing this challenge, this paper formalizes this continual learning and private unlearning (CLPU) problem. The paper further introduces a straightforward but exactly private solution, CLPU-DER++, as the first step towards solving the CLPU problem, along with a set of carefully designed benchmark problems to evaluate the effectiveness of the proposed solution.}
}
Endnote
%0 Conference Paper
%T Continual Learning and Private Unlearning
%A Bo Liu
%A Qiang Liu
%A Peter Stone
%B Proceedings of The 1st Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2022
%E Sarath Chandar
%E Razvan Pascanu
%E Doina Precup
%F pmlr-v199-liu22a
%I PMLR
%P 243--254
%U https://proceedings.mlr.press/v199/liu22a.html
%V 199
%X As intelligent agents become autonomous over longer periods of time, they may eventually become lifelong counterparts to specific people. If so, it may be common for a user to want the agent to master a task temporarily but later on to forget the task due to privacy concerns. However enabling an agent to forget privately what the user specified without degrading the rest of the learned knowledge is a challenging problem. With the aim of addressing this challenge, this paper formalizes this continual learning and private unlearning (CLPU) problem. The paper further introduces a straightforward but exactly private solution, CLPU-DER++, as the first step towards solving the CLPU problem, along with a set of carefully designed benchmark problems to evaluate the effectiveness of the proposed solution.
APA
Liu, B., Liu, Q. & Stone, P. (2022). Continual Learning and Private Unlearning. Proceedings of The 1st Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research 199:243-254. Available from https://proceedings.mlr.press/v199/liu22a.html.