Grouped Sequential Optimization Strategy - the Application of Hyperparameter Importance Assessment in Deep Learning

Ruinan Wang, Ian T. Nabney, Mohammad Golbabaee
Conference on Parsimony and Learning, PMLR 280:768-779, 2025.

Abstract

Hyperparameter optimization (HPO) is a critical component of machine learning pipelines, significantly affecting model robustness, stability, and generalization. However, HPO is often a time-consuming and computationally intensive task. Traditional HPO methods, such as grid search and random search, often suffer from inefficiency; Bayesian optimization, while more efficient, still struggles with high-dimensional search spaces. In this paper, we explore how insights gained from hyperparameter importance assessment (HIA) can be leveraged to accelerate HPO, reducing both time and computational resources. Building on prior work that quantified hyperparameter importance by evaluating 10 hyperparameters on CNNs across 10 common image classification datasets, we implement a novel HPO strategy called 'Sequential Grouping'. That prior work assessed the importance weights of the investigated hyperparameters based on their influence on model performance, and we leverage those weights to structure our HPO process. Our experiments, validated across six additional image classification datasets, demonstrate that incorporating HIA can significantly accelerate HPO without compromising model performance, reducing optimization time by an average of 31.9% compared to the conventional simultaneous strategy.
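
To make the 'Sequential Grouping' idea concrete, the sketch below shows a grouped sequential HPO loop: hyperparameters are partitioned into importance-ordered groups, each group is tuned in turn with Bayesian optimization while already-tuned groups stay frozen at their best values and not-yet-tuned ones keep defaults. It assumes Optuna as the Bayesian-optimization backend; the groups, importance ordering, search ranges, defaults, and the stand-in objective are illustrative placeholders, not the configuration reported in the paper.

import math
import optuna

# Hypothetical importance-ordered groups (most important first); in the paper
# the ordering would come from the HIA importance weights.
GROUPS = [
    {"lr": ("log", 1e-5, 1e-1), "batch_size": ("int", 16, 256)},
    {"weight_decay": ("log", 1e-6, 1e-2), "momentum": ("float", 0.5, 0.99)},
    {"dropout": ("float", 0.0, 0.5)},
]

# Defaults used for hyperparameters whose group has not been tuned yet.
DEFAULTS = {"lr": 1e-3, "batch_size": 64, "weight_decay": 1e-4,
            "momentum": 0.9, "dropout": 0.1}

def train_and_evaluate(config):
    # Stand-in for CNN training + validation accuracy; replace with a real run.
    return -abs(math.log10(config["lr"]) + 3.0) - abs(config["dropout"] - 0.1)

def suggest(trial, name, spec):
    # Map a (kind, low, high) range spec onto the matching Optuna suggestion.
    kind, low, high = spec
    if kind == "log":
        return trial.suggest_float(name, low, high, log=True)
    if kind == "int":
        return trial.suggest_int(name, low, high)
    return trial.suggest_float(name, low, high)

fixed = dict(DEFAULTS)
for group in GROUPS:  # tune one importance group at a time
    def objective(trial, group=group):
        config = dict(fixed)  # frozen values for all other hyperparameters
        config.update({name: suggest(trial, name, spec)
                       for name, spec in group.items()})
        return train_and_evaluate(config)

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=20)
    fixed.update(study.best_params)  # freeze the tuned group at its best values

print("final configuration:", fixed)

Because each study searches only a low-dimensional subspace while more important, already-tuned groups stay fixed, the strategy sidesteps the high-dimensional search space that slows standard Bayesian optimization, which is the source of the time savings the abstract reports.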

Cite this Paper


BibTeX
@InProceedings{pmlr-v280-wang25c,
  title     = {Grouped Sequential Optimization Strategy - the Application of Hyperparameter Importance Assessment in Deep Learning},
  author    = {Wang, Ruinan and Nabney, Ian T. and Golbabaee, Mohammad},
  booktitle = {Conference on Parsimony and Learning},
  pages     = {768--779},
  year      = {2025},
  editor    = {Chen, Beidi and Liu, Shijia and Pilanci, Mert and Su, Weijie and Sulam, Jeremias and Wang, Yuxiang and Zhu, Zhihui},
  volume    = {280},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Mar},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v280/main/assets/wang25c/wang25c.pdf},
  url       = {https://proceedings.mlr.press/v280/wang25c.html}
}
Endnote
%0 Conference Paper
%T Grouped Sequential Optimization Strategy - the Application of Hyperparameter Importance Assessment in Deep Learning
%A Ruinan Wang
%A Ian T. Nabney
%A Mohammad Golbabaee
%B Conference on Parsimony and Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Beidi Chen
%E Shijia Liu
%E Mert Pilanci
%E Weijie Su
%E Jeremias Sulam
%E Yuxiang Wang
%E Zhihui Zhu
%F pmlr-v280-wang25c
%I PMLR
%P 768--779
%U https://proceedings.mlr.press/v280/wang25c.html
%V 280
APA
Wang, R., Nabney, I. T. & Golbabaee, M. (2025). Grouped Sequential Optimization Strategy - the Application of Hyperparameter Importance Assessment in Deep Learning. Conference on Parsimony and Learning, in Proceedings of Machine Learning Research 280:768-779. Available from https://proceedings.mlr.press/v280/wang25c.html.