Adaptive Hyperheuristic Framework for Hyperparameter Tuning: A Q-Learning-Based Heuristic Selection Approach with Simulated Annealing Acceptance Criteria

Authors

DOI:

https://doi.org/10.29020/nybg.ejpam.v18i3.6348

Keywords:

Hyperheuristics, Hyperparameter Tuning, Q-Learning, Simulated Annealing, Machine Learning Optimization

Abstract

Hyperparameter tuning is a crucial step in optimizing machine learning models, directly impacting their performance and generalization capabilities. Traditional approaches, such as grid search, random search, and Bayesian optimization, often suffer from inefficiencies, especially in high-dimensional hyperparameter spaces. To address these challenges, this paper proposes an adaptive hyperheuristic framework for hyperparameter tuning, integrating Q-learning-based heuristic selection and simulated annealing acceptance criteria. The proposed model is referred to as the AHPQA framework in this article. The framework employs a two-layered approach: a high-level heuristic selection strategy driven by Q-learning, and a set of low-level heuristics categorized into constructive, improvement, and perturbation heuristics. The Q-learning model dynamically selects the most effective heuristic based on historical performance, ensuring an adaptive exploration-exploitation balance. Additionally, the acceptance of new hyperparameter configurations follows a simulated annealing-based probabilistic function, allowing the search process to escape local optima. The proposed method is evaluated on benchmark machine learning models, including deep learning architectures and ensemble classifiers, using publicly available datasets. Comparative analysis against conventional tuning approaches demonstrates superior convergence speed, computational efficiency, and model performance. The results indicate that the adaptive hyperheuristic approach significantly reduces computational overhead while achieving competitive or improved model accuracy. This study contributes a novel hyperheuristic-based optimization framework for hyperparameter tuning, providing a scalable, adaptable, and efficient solution applicable across various machine learning domains.
Future research directions include extending the framework to reinforcement learning environments and integrating explainable AI techniques for improved interpretability.
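The two-layer scheme described in the abstract can be sketched in a few lines: a high-level Q-learning layer scores each low-level heuristic by the improvement it has historically delivered, and a simulated-annealing rule decides whether to accept each new configuration. The sketch below is a minimal illustration under stated assumptions; the objective function, the three toy heuristics, the reward shaping, and the cooling schedule are all hypothetical stand-ins, not the authors' AHPQA implementation.

```python
import math
import random

def objective(x):
    # Stand-in for the validation loss of a model with hyperparameter x.
    return (x - 3.0) ** 2

# Low-level heuristics: constructive, improvement, and perturbation moves
# (illustrative examples only).
heuristics = [
    lambda x: x + random.uniform(-0.1, 0.1),   # improvement: local step
    lambda x: x + random.uniform(-1.0, 1.0),   # perturbation: larger jump
    lambda x: random.uniform(-10.0, 10.0),     # constructive: fresh restart
]

def tune(iters=2000, alpha=0.1, eps=0.2, t0=1.0, cooling=0.995):
    q = [0.0] * len(heuristics)   # Q-value per heuristic (single-state case)
    x = 0.0
    fx = objective(x)
    best_x, best_f = x, fx
    temp = t0
    for _ in range(iters):
        # High level: epsilon-greedy Q-learning selection of a heuristic.
        if random.random() < eps:
            h = random.randrange(len(heuristics))
        else:
            h = max(range(len(heuristics)), key=lambda i: q[i])
        cand = heuristics[h](x)
        fc = objective(cand)
        reward = fx - fc   # improvement the heuristic offered this step
        # Simulated-annealing acceptance: always take improvements,
        # accept worse configurations with probability exp(-delta / T).
        if fc < fx or random.random() < math.exp((fx - fc) / max(temp, 1e-9)):
            x, fx = cand, fc
        # Update the heuristic's Q-value toward the observed reward.
        q[h] += alpha * (reward - q[h])
        if fx < best_f:
            best_x, best_f = x, fx
        temp *= cooling    # geometric cooling schedule
    return best_x, best_f
```

In a real tuning run the scalar `x` would be a full hyperparameter configuration and `objective` a cross-validated model evaluation; the structure of the loop (select heuristic, propose, SA-accept, update Q) stays the same.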

Author Biographies

  • Kassem Danach, Basic and Applied Sciences Research Center, Al Maaref University, Beirut, Lebanon

Kassem Danach is an Associate Professor and Chairperson of the MIT Department at Al Maaref University. He holds a Ph.D. in Computer and Telecommunication Engineering from École Centrale de Lille. His research focuses on artificial intelligence, combinatorial optimization, and intelligent decision support systems. He holds several patents in applied AI, particularly in healthcare and behavioral analytics. He actively supervises graduate research in AI and data science, and is a member of various scientific committees and editorial boards.

  • Wael Hosny Fouad Aly, Basic and Applied Sciences Research Center, Al Maaref University, Beirut, Lebanon

Dr. Wael Hosny Fouad Aly received his Ph.D. from the University of Western Ontario, Canada, in 2006. He is a Professional Engineer of Ontario, P.Eng. (Canada). Since 2016 he has been a Professor of Computer Engineering at the College of Engineering and Technology of the American University of the Middle East in Kuwait. His research interests include SDN networking, distributed systems, Optical Burst Switching (OBS), Wireless Sensor Networks (WSN), Differentiated Services, and multi-agent systems. He is a senior member of the IEEE and the IEEE Computer Society, and an ABET program evaluator (EAC/CAC). He can be contacted at email: [email protected]

Downloads

Published

2025-08-01

Issue

Section

Optimization

How to Cite

Adaptive Hyperheuristic Framework for Hyperparameter Tuning: A Q-Learning-Based Heuristic Selection Approach with Simulated Annealing Acceptance Criteria. (2025). European Journal of Pure and Applied Mathematics, 18(3), 6348. https://doi.org/10.29020/nybg.ejpam.v18i3.6348