Another hybrid conjugate gradient algorithm for unconstrained optimization

Research paper by Neculai Andrei

Indexed on: 19 Jan '08
Published on: 19 Jan '08
Published in: Numerical Algorithms



Abstract

Another hybrid conjugate gradient algorithm is subject to analysis. The parameter \( \beta_k \) is computed as a convex combination of \( \beta_k^{HS} \) (Hestenes-Stiefel) and \( \beta_k^{DY} \) (Dai-Yuan), i.e. \( \beta_k^{C} = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY} \). The parameter \( \theta_k \) in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair \( (s_k, y_k) \) satisfies the quasi-Newton equation \( \nabla^2 f(x_{k+1})\, s_k = y_k \), where \( s_k = x_{k+1} - x_k \) and \( y_k = g_{k+1} - g_k \). The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms, as well as the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
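
To make the convex combination above concrete, the following minimal Python/NumPy sketch computes the Hestenes-Stiefel and Dai-Yuan parameters and blends them with a given \( \theta_k \). It is an illustration under stated assumptions, not the paper's implementation: the function names and the clipping of \( \theta_k \) to [0, 1] are assumptions, and the paper's specific formula for \( \theta_k \) (derived from the Newton direction and the quasi-Newton equation) is not reproduced here.

```python
import numpy as np

def hybrid_beta(g_new, y, d, theta):
    """Convex combination of the Hestenes-Stiefel and Dai-Yuan parameters.

    g_new : gradient g_{k+1} at x_{k+1}
    y     : y_k = g_{k+1} - g_k
    d     : previous search direction d_k
    theta : blending parameter theta_k (its derivation from the Newton
            direction is given in the paper and is not reproduced here)
    """
    dy = d @ y                          # shared denominator d_k^T y_k
    beta_hs = (g_new @ y) / dy          # Hestenes-Stiefel parameter
    beta_dy = (g_new @ g_new) / dy      # Dai-Yuan parameter
    theta = min(max(theta, 0.0), 1.0)   # assumed: keep a convex combination
    return (1.0 - theta) * beta_hs + theta * beta_dy

def next_direction(g_new, y, d, theta):
    """Standard CG direction update d_{k+1} = -g_{k+1} + beta_k^C d_k."""
    return -g_new + hybrid_beta(g_new, y, d, theta) * d
```

In a full method, each step length along this direction would be chosen by a standard Wolfe line search, as the abstract states.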