Obeying the Order: Introducing Ordered Transfer Hyperparameter Optimization
Proceedings of the Fourth International Conference on Automated Machine Learning, PMLR 293:14/1-29, 2025.
Abstract
In many deployed settings, hyperparameters are retuned as more data are collected, for instance tuning a sequence of movie recommendation systems as more movies and ratings are added. Despite this, transfer hyperparameter optimisation (HPO) has not been thoroughly analysed in this setting. We introduce ordered transfer hyperparameter optimisation (OTHPO), a version of transfer learning for HPO where the tasks follow a sequential order. Unlike in state-of-the-art transfer HPO, the assumption is that each task is most correlated with those immediately before it. We propose a formal definition and illustrate the key difference from standard transfer HPO approaches. We show how simple methods that take the order into account can outperform more sophisticated transfer methods by better tracking smooth shifts of the hyperparameter landscape. We provide ten benchmarks in the setting of gradually accumulating data, as well as a separate real-world motivated optimisation problem, and open-source them to foster future research on ordered transfer HPO.
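To illustrate the kind of simple order-aware method the abstract refers to, below is a minimal sketch, assuming a warm-start baseline in which each task's search is seeded with the best configuration found on the previous task. The search space, the toy objective with a drifting optimum, and all function names are illustrative assumptions, not the paper's actual benchmarks or method.

```python
# Minimal sketch of an order-aware warm-start baseline for sequential HPO.
# Assumption: adjacent tasks are the most correlated, so the previous task's
# incumbent is a good starting point for the next task.
import random

def sample_config(rng):
    """Draw a random hyperparameter configuration from a toy 2-D search space."""
    return {"learning_rate": 10 ** rng.uniform(-4, 0),
            "max_depth": rng.randint(1, 10)}

def evaluate(config, task):
    """Stand-in objective; in practice this would train and validate a model on `task`."""
    # Hypothetical smooth shift: the optimal learning rate drifts with the task index.
    target_lr = 10 ** (-1 - 0.1 * task)
    return (config["learning_rate"] - target_lr) ** 2 + 0.01 * config["max_depth"]

def ordered_warm_start_hpo(num_tasks=5, budget_per_task=20, seed=0):
    rng = random.Random(seed)
    best_config = None
    for task in range(num_tasks):
        # Seed the candidate pool with the previous task's incumbent (if any),
        # then fill the remaining budget with fresh random samples.
        candidates = [best_config] if best_config is not None else []
        candidates += [sample_config(rng) for _ in range(budget_per_task - len(candidates))]
        best_config = min(candidates, key=lambda c: evaluate(c, task))
        print(f"task {task}: best config {best_config}")
    return best_config

if __name__ == "__main__":
    ordered_warm_start_hpo()
```

In this sketch the only use of transfer is the single carried-over incumbent per task; standard transfer HPO methods would instead pool observations from all previous tasks without exploiting their order.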