Abstract
This study proposes a nonlinear conjugate gradient algorithm; such algorithms are widely used in optimization, especially for large-scale problems, because they do not require the storage of any matrices. The proposed algorithm modifies Hideaki and Yasushi's (HY) conjugate gradient algorithm and satisfies a parameterized sufficient descent condition whose parameter is calculated using the conjugacy condition. The new algorithm always produces descent search directions and is shown to be convergent under some assumptions. The main contribution of this work is a proof of global convergence for the modified nonlinear conjugate gradient method. The numerical results reveal the effectiveness of the proposed algorithm on the given set of test problems.
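To make the setting concrete, below is a minimal sketch of a generic nonlinear conjugate gradient loop in Python (an illustrative assumption, not the paper's method). The beta formula shown is the classical Hestenes–Stiefel choice, used only as a stand-in for the modified HY formula, and the Armijo backtracking line search and descent-restart safeguard are simplifications of the sufficient descent and conjugacy conditions analyzed in the paper.

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # Simple Armijo backtracking; stands in for the line search assumed in CG theory.
    fx, gx = f(x), grad(x)
    while f(x + alpha * d) > fx + c * alpha * gx.dot(d):
        alpha *= rho
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic nonlinear CG loop; only vectors are stored, no matrices.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Hestenes–Stiefel-type beta, a placeholder for the paper's modified HY formula.
        beta = g_new.dot(y) / max(d.dot(y), 1e-12)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a small quadratic test problem.
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(nonlinear_cg(f, grad, np.zeros(2)))
```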