Asian Journal of Scientific Research
  Year: 2018 | Volume: 11 | Issue: 2 | Page No.: 185-194
DOI: 10.3923/ajsr.2018.185.194
A Five-term Hybrid Conjugate Gradient Method with Global Convergence and Descent Properties for Unconstrained Optimization Problems
Olawale Joshua Adeleke and Idowu Ademola Osinuga

Abstract:
Background and Objective: The nonlinear conjugate gradient method is a recurrence-based technique for efficiently solving large-scale unconstrained optimization problems. In this study, a new hybrid nonlinear conjugate gradient method is proposed that combines the positive features of five existing non-hybrid conjugate gradient methods. Methodology: The proposed method generates descent directions independently of the line search procedure. Under assumptions on the objective function, the global convergence of the method was established with the standard Wolfe line search conditions. Results: Numerical experiments with selected benchmark test functions showed that the method is competitive and promising in comparison with other non-hybrid methods. Conclusion: As a future study, the proposed method will be tested against recently proposed related methods.
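The five-term hybrid update formula itself is given in the full text. For orientation only, the following is a minimal sketch of a generic nonlinear conjugate gradient loop with a Wolfe line search; the conjugacy parameter used here is the classical PRP+ rule as a stand-in assumption, not the authors' five-term hybrid formula.

# Minimal sketch of a generic nonlinear conjugate gradient loop with a
# Wolfe line search. The beta update is the classical PRP+ rule, used only
# as a placeholder; it is NOT the paper's five-term hybrid formula.
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the standard (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:               # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Placeholder conjugacy parameter (PRP+); the paper instead combines
        # five such terms into a hybrid parameter.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple quadratic with minimum at (1, 2):
f = lambda x: (x[0] - 1)**2 + 2*(x[1] - 2)**2
grad = lambda x: np.array([2*(x[0] - 1), 4*(x[1] - 2)])
print(nonlinear_cg(f, grad, np.zeros(2)))   # approximately [1. 2.]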
How to cite this article:

Olawale Joshua Adeleke and Idowu Ademola Osinuga, 2018. A Five-term Hybrid Conjugate Gradient Method with Global Convergence and Descent Properties for Unconstrained Optimization Problems. Asian Journal of Scientific Research, 11: 185-194.

DOI: 10.3923/ajsr.2018.185.194

URL: https://scialert.net/abstract/?doi=ajsr.2018.185.194

 