Adaptive Hyperparameter Optimization for Continual Learning Scenarios

Rudy Semola, Julio Hurtado, Vincenzo Lomonaco, Davide Bacciu
Proceedings of the 1st ContinualAI Unconference, 2023, PMLR 249:1-14, 2024.

Abstract

Hyperparameter selection in continual learning scenarios is a challenging and underexplored aspect, especially in practical non-stationary environments. Traditional approaches, such as grid searches with held-out validation data from all tasks, are unrealistic for building accurate lifelong learning systems. This paper explores the role of hyperparameter selection in continual learning and the necessity of continually and automatically tuning hyperparameters according to the complexity of the task at hand. We propose leveraging the sequential nature of task learning to improve the efficiency of hyperparameter optimization. Using functional analysis of variance (fANOVA)-based techniques, we identify the hyperparameters with the greatest impact on performance. We demonstrate empirically that this approach, agnostic to the continual scenario and strategy, speeds up hyperparameter optimization continually across tasks and remains robust even when the sequential task order varies. We believe these findings can contribute to the advancement of continual learning methodologies towards more efficient, robust, and adaptable models for real-world applications.
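To make the two ingredients of the abstract concrete, the following minimal sketch (not the authors' implementation) shows one way to realize them with Optuna: each task's hyperparameter search is warm-started from the previous task's best configuration, and per-task hyperparameter importance is scored with Optuna's fANOVA evaluator. The function train_and_evaluate and the five-task loop are hypothetical placeholders; a real objective would train a continual learning strategy on each incoming task and return validation performance.

# Minimal sketch, assuming Optuna (and scikit-learn for its fANOVA
# evaluator) is installed. Not the paper's code: a generic illustration
# of sequential, warm-started HPO with fANOVA importance analysis.
import math
import optuna

optuna.logging.set_verbosity(optuna.logging.WARNING)

def train_and_evaluate(task_id: int, lr: float, reg_lambda: float) -> float:
    # Hypothetical placeholder: a real objective would train a continual
    # learning strategy on task `task_id` and return validation accuracy.
    # A synthetic surrogate with a slowly drifting optimum stands in here
    # so the sketch runs end to end.
    target_lr = 0.01 * (1.0 + 0.1 * task_id)
    return -(math.log10(lr) - math.log10(target_lr)) ** 2 \
           - 0.1 * (reg_lambda - 1.0) ** 2

def make_objective(task_id: int):
    def objective(trial: optuna.Trial) -> float:
        lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
        reg_lambda = trial.suggest_float("reg_lambda", 1e-3, 10.0, log=True)
        return train_and_evaluate(task_id, lr, reg_lambda)
    return objective

best_params = None
for task_id in range(5):  # the sequential task stream
    study = optuna.create_study(direction="maximize")
    if best_params is not None:
        # Warm start: evaluate the previous task's best configuration
        # first, exploiting the assumption that good settings transfer
        # between consecutive tasks.
        study.enqueue_trial(best_params)
    study.optimize(make_objective(task_id), n_trials=20)
    best_params = study.best_params

    # fANOVA-based importance: which hyperparameters explain most of the
    # variance in the objective on this task?
    importances = optuna.importance.get_param_importances(
        study, evaluator=optuna.importance.FanovaImportanceEvaluator()
    )
    print(f"task {task_id}: best={best_params}, importance={importances}")

Warm-starting pays off precisely when consecutive tasks have correlated optima, which is the sequential structure the paper proposes to exploit; the importance scores can then justify narrowing later searches to the few hyperparameters that matter.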

Cite this Paper


BibTeX
@InProceedings{pmlr-v249-semola24a,
  title     = {Adaptive Hyperparameter Optimization for Continual Learning Scenarios},
  author    = {Semola, Rudy and Hurtado, Julio and Lomonaco, Vincenzo and Bacciu, Davide},
  booktitle = {Proceedings of the 1st ContinualAI Unconference, 2023},
  pages     = {1--14},
  year      = {2024},
  editor    = {Swaroop, Siddharth and Mundt, Martin and Aljundi, Rahaf and Khan, Mohammad Emtiyaz},
  volume    = {249},
  series    = {Proceedings of Machine Learning Research},
  month     = {09 Oct},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v249/main/assets/semola24a/semola24a.pdf},
  url       = {https://proceedings.mlr.press/v249/semola24a.html},
  abstract  = {Hyperparameter selection in continual learning scenarios is a challenging and underexplored aspect, especially in practical non-stationary environments. Traditional approaches, such as grid searches with held-out validation data from all tasks, are unrealistic for building accurate lifelong learning systems. This paper explores the role of hyperparameter selection in continual learning and the necessity of continually and automatically tuning hyperparameters according to the complexity of the task at hand. We propose leveraging the sequential nature of task learning to improve the efficiency of hyperparameter optimization. Using functional analysis of variance (fANOVA)-based techniques, we identify the hyperparameters with the greatest impact on performance. We demonstrate empirically that this approach, agnostic to the continual scenario and strategy, speeds up hyperparameter optimization continually across tasks and remains robust even when the sequential task order varies. We believe these findings can contribute to the advancement of continual learning methodologies towards more efficient, robust, and adaptable models for real-world applications.}
}
Endnote
%0 Conference Paper
%T Adaptive Hyperparameter Optimization for Continual Learning Scenarios
%A Rudy Semola
%A Julio Hurtado
%A Vincenzo Lomonaco
%A Davide Bacciu
%B Proceedings of the 1st ContinualAI Unconference, 2023
%C Proceedings of Machine Learning Research
%D 2024
%E Siddharth Swaroop
%E Martin Mundt
%E Rahaf Aljundi
%E Mohammad Emtiyaz Khan
%F pmlr-v249-semola24a
%I PMLR
%P 1--14
%U https://proceedings.mlr.press/v249/semola24a.html
%V 249
%X Hyperparameter selection in continual learning scenarios is a challenging and underexplored aspect, especially in practical non-stationary environments. Traditional approaches, such as grid searches with held-out validation data from all tasks, are unrealistic for building accurate lifelong learning systems. This paper explores the role of hyperparameter selection in continual learning and the necessity of continually and automatically tuning hyperparameters according to the complexity of the task at hand. We propose leveraging the sequential nature of task learning to improve the efficiency of hyperparameter optimization. Using functional analysis of variance (fANOVA)-based techniques, we identify the hyperparameters with the greatest impact on performance. We demonstrate empirically that this approach, agnostic to the continual scenario and strategy, speeds up hyperparameter optimization continually across tasks and remains robust even when the sequential task order varies. We believe these findings can contribute to the advancement of continual learning methodologies towards more efficient, robust, and adaptable models for real-world applications.
APA
Semola, R., Hurtado, J., Lomonaco, V. & Bacciu, D. (2024). Adaptive Hyperparameter Optimization for Continual Learning Scenarios. Proceedings of the 1st ContinualAI Unconference, 2023, in Proceedings of Machine Learning Research 249:1-14. Available from https://proceedings.mlr.press/v249/semola24a.html.