New Spectral Idea for Conjugate Gradient Methods and its Global Convergence Theorems
DOI: https://doi.org/10.29020/nybg.ejpam.v15i2.4364
Keywords: conjugate gradient, spectral, unconstrained optimization, global convergence, descent property
Abstract
Conjugate gradient methods have recently been widely used for unconstrained optimization, especially for problems known as large-scale problems. This work proposes a new spectral gradient coefficient, obtained as a convex linear combination of two different gradient coefficients, for solving unconstrained optimization problems. One of the most essential features of the suggested strategy is that it guarantees a sufficient descent direction under the line search employed. Furthermore, numerical results on the test problems show that the proposed strategy is more efficient than previous conjugate gradient approaches. When compared with other conjugate gradient methods, such as the FR method, the proposed method is shown to be globally convergent, indicating that it can be used in scientific data computation.
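The abstract does not give the exact coefficients used in the paper, but the general scheme it describes, a spectral conjugate gradient direction whose parameter is a convex combination of two classical gradient coefficients, can be sketched as follows. This is an illustrative sketch only: the Fletcher-Reeves (FR) and Polak-Ribiere-Polyak (PRP) coefficients, the fixed step size in place of a line search, and the Barzilai-Borwein-type spectral scaling are all assumptions for illustration, not the paper's actual method.

```python
# Illustrative sketch of a spectral conjugate gradient iteration.
# Assumptions (not from the paper): the hybrid coefficient is a convex
# combination of the classical FR and PRP coefficients with weight lam,
# the step size is a small fixed alpha (a real implementation would use
# a line search), and the spectral scaling theta is Barzilai-Borwein-type.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def spectral_cg(grad, x0, lam=0.5, alpha=1e-2, tol=1e-8, max_iter=20000):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                       # initial direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:              # stop when the gradient is small
            break
        s = [alpha * di for di in d]            # step s_k = alpha * d_k
        x = [xi + si for xi, si in zip(x, s)]
        g_new = grad(x)
        y = [a - b for a, b in zip(g_new, g)]   # gradient difference y_k
        beta_fr = dot(g_new, g_new) / dot(g, g)     # Fletcher-Reeves
        beta_prp = dot(g_new, y) / dot(g, g)        # Polak-Ribiere-Polyak
        beta = lam * beta_fr + (1 - lam) * beta_prp  # convex combination
        sy = dot(s, y)
        # Barzilai-Borwein-type spectral scaling, with a safeguard fallback
        theta = dot(s, s) / sy if sy > 1e-12 else 1.0
        d = [-theta * gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x
```

For example, minimizing the convex quadratic f(x) = x1^2 + 2*x2^2 via `spectral_cg(lambda x: [2*x[0], 4*x[1]], [1.0, 1.0])` drives the iterate toward the minimizer at the origin.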
License
Copyright (c) 2022 European Journal of Pure and Applied Mathematics
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Upon acceptance of an article by the journal, the author(s) accept(s) the transfer of copyright of the article to European Journal of Pure and Applied Mathematics.
European Journal of Pure and Applied Mathematics will be the copyright holder.