A Globally Convergent Conjugate Gradient Method Incorporating Perry’s Parameter for Unconstrained Optimization

Authors

  • Hasan Jameel Department of Mathematics, College of Basic Education, University of Duhok, Kurdistan Region-Iraq
  • Alaa Luqman Ibrahim Department of Mathematics, College of Science, University of Zakho, Kurdistan Region, Iraq
  • Neven Eoarsh Zaya Management Information System, Administrative Technical Institute, Duhok Polytechnic University, Kurdistan Region-Iraq

DOI:

https://doi.org/10.29020/nybg.ejpam.v18i2.5932

Keywords:

Unconstrained Optimization, Conjugate Gradient Methods, Global Convergence, Numerical Performance

Abstract

To develop a conjugate gradient method that is both theoretically robust and practically effective for solving unconstrained optimization problems, this paper introduces a novel conjugate gradient method incorporating Perry's parameter along with the gradient-difference vector, as suggested by Powell, to enhance performance. The proposed method satisfies the descent condition, and its global convergence is established under standard assumptions.
To assess its effectiveness, the method was tested on a diverse set of unconstrained optimization problems and compared against well-known conjugate gradient methods. Numerical experiments indicate that the proposed method outperforms classical approaches in terms of iteration count, function evaluations, and computational time. The results confirm the robustness and efficiency of the proposed method, making it a competitive choice for large-scale optimization problems.
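The abstract does not reproduce the paper's exact update formula, but the classical Perry (1978) parameter it builds on is well known: with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, Perry's beta is g_{k+1}^T(y_k - s_k) / (d_k^T y_k). The following is a minimal, hedged sketch of a nonlinear conjugate gradient iteration using that classical parameter with an Armijo backtracking line search and a steepest-descent restart safeguard; it illustrates the general framework only, not the specific method proposed in this paper.

```python
import numpy as np

def perry_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the classical Perry (1978) parameter:
    beta_k = g_{k+1}^T (y_k - s_k) / (d_k^T y_k).
    Illustrative sketch only; not the paper's proposed formula."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x                   # step vector s_k
        y = g_new - g                   # gradient-difference vector y_k
        denom = d @ y
        beta = (g_new @ (y - s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:              # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = perry_cg(f, grad, np.zeros(2))
```

The descent safeguard above mirrors the role of the descent condition mentioned in the abstract: whenever the Perry direction fails to be a descent direction, the iteration falls back to the negative gradient.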

Published

2025-05-01

Section

Optimization

How to Cite

A Globally Convergent Conjugate Gradient Method Incorporating Perry’s Parameter for Unconstrained Optimization. (2025). European Journal of Pure and Applied Mathematics, 18(2), 5932. https://doi.org/10.29020/nybg.ejpam.v18i2.5932