Modifying Spectral Conjugate Gradient Method for Solving Unconstrained Optimization Problems

Authors

  • Kamal Murad, University of Zakho
  • Salah Shareef, University of Zakho

DOI:

https://doi.org/10.29020/nybg.ejpam.v18i3.6145

Keywords:

Unconstrained Optimization, Descent Conditions, Spectral Conjugate Gradient, Global Convergence

Abstract

This paper proposes a new spectral conjugate gradient method for large-scale unconstrained optimization, designed to improve convergence efficiency by reducing both iteration counts and function evaluations. The method introduces a modified spectral coefficient and a new search direction formula that guarantees the descent and sufficient descent conditions at every iteration, without increasing the per-iteration computational burden. Unlike existing methods such as the classical conjugate gradient algorithm, the proposed scheme integrates spectral scaling in a way that enhances direction quality and step stability. Theoretical analysis establishes global convergence under standard assumptions. Extensive numerical experiments on a diverse set of test problems demonstrate the superior performance of the proposed method over the classical conjugate gradient and spectral conjugate gradient methods, particularly in scenarios where fast convergence and low evaluation cost are critical. These results suggest that the proposed method offers a robust and computationally efficient alternative for solving unconstrained optimization problems. Future work will explore its application to structured and real-world large-scale problems.
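The abstract does not reproduce the modified spectral coefficient or the new search-direction formula. As a general point of reference only, the Python sketch below illustrates a generic spectral conjugate gradient iteration with a Barzilai-Borwein-style spectral scaling, a Fletcher-Reeves conjugacy parameter, and an Armijo backtracking line search; these parameter choices are illustrative assumptions and are not the scheme proposed in the paper.

import numpy as np

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic spectral conjugate gradient sketch (not the paper's method).

    Assumed choices: Barzilai-Borwein-style spectral scaling theta,
    Fletcher-Reeves beta, and an Armijo backtracking line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Bounded Armijo backtracking line search (placeholder for a Wolfe search).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), float(g @ d)
        for _ in range(60):
            if f(x + alpha * d) <= fx + c1 * alpha * slope:
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral parameter: BB-like scaling (assumed choice).
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
        # Conjugacy parameter: Fletcher-Reeves (assumed choice).
        beta = (g_new @ g_new) / (g @ g)
        d = -theta * g_new + beta * d
        # Safeguard: restart with steepest descent if d is not a descent direction.
        if g_new @ d >= 0.0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example: minimize the quadratic f(x) = 0.5 x^T A x - b^T x.
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(spectral_cg(f, grad, np.zeros(2)))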

Published

2025-08-01

Issue

Vol. 18 No. 3 (2025)

Section

Optimization

How to Cite

Modifying Spectral Conjugate Gradient Method for Solving Unconstrained Optimization Problems. (2025). European Journal of Pure and Applied Mathematics, 18(3), 6145. https://doi.org/10.29020/nybg.ejpam.v18i3.6145