An Enhanced Conjugate Gradient Method for Nonlinear Minimization Problems

Authors

  • Ahmed Anwer Mustafa University of Zakho
  • Hussein Khatab University of Zakho

DOI:

https://doi.org/10.29020/nybg.ejpam.v18i3.6377

Keywords:

Nonlinear, Descent, Convergence

Abstract

Because of their computational efficiency and minimal memory requirements, conjugate gradient methods are a fundamental family of algorithms for large-scale unconstrained nonlinear optimization problems. This study presents a new variant of the Hestenes-Stiefel (HS) method, with the goal of improving its convergence properties without compromising its simplicity. We rigorously prove the global convergence of the proposed method under standard assumptions and show that it satisfies the conjugacy, descent, and sufficient descent conditions. Extensive numerical experiments, covering a wide range of benchmark problems, show that the proposed method consistently outperforms the classical HS method in terms of function evaluations and iteration count.
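The abstract does not detail the proposed variant, but the classical HS scheme it builds on can be sketched as follows. This is a minimal illustration, not the authors' method: the quadratic test problem, the exact line search, and all function and variable names are assumptions for demonstration; general nonlinear problems would use an inexact (e.g. Wolfe) line search.

```python
# Sketch of the classical Hestenes-Stiefel (HS) nonlinear conjugate
# gradient method. The quadratic objective and exact line search are
# illustrative assumptions, not part of the paper's proposed variant.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def hs_cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x via HS conjugate gradient."""
    x = list(x0)
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient: A x - b
    d = [-gi for gi in g]                             # initial direction
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:                    # gradient-norm stop
            break
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)               # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x), b)]
        y = [gn - gi for gn, gi in zip(g_new, g)]     # gradient change
        denom = dot(d, y)
        # HS parameter: beta = g_{k+1}^T y_k / (d_k^T y_k)
        beta = dot(g_new, y) / denom if denom != 0 else 0.0
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# On a 2-D strictly convex quadratic the method recovers the linear CG
# behavior and converges in two iterations, solving A x = b.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = hs_cg_quadratic(A, b, [0.0, 0.0])  # -> approximately (1/11, 7/11)
```

On quadratics with exact line search the HS formula coincides with the Fletcher-Reeves and Polak-Ribiere choices; the variants differ, as the abstract indicates, in their descent and convergence behavior on general nonlinear problems.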

Author Biography

  • Hussein Khatab, University of Zakho

    Dr. Hussein Ageel Khatab
    Department of Mathematics

    College of Science

    University of Zakho
    Lecturer
    Iraq-Zakho
    Email: [email protected]

Published

2025-08-01

Section

Optimization

How to Cite

An Enhanced Conjugate Gradient Method for Nonlinear Minimization Problems. (2025). European Journal of Pure and Applied Mathematics, 18(3), 6377. https://doi.org/10.29020/nybg.ejpam.v18i3.6377