Glasnik Matematički, Vol. 52, No. 2 (2017), 361-375.

AN EXTENDED DAI-LIAO CONJUGATE GRADIENT METHOD WITH GLOBAL CONVERGENCE FOR NONCONVEX FUNCTIONS

Mohammad Reza Arazm, Saman Babaie-Kafaki and Reza Ghanbari

Department of Mathematics, Faculty of Mathematics, Statistics and Computer Science, Semnan University, P.O. Box: 35195-363, Semnan, Iran
e-mail: mohamadreza.arazm@semnan.ac.ir

Department of Mathematics, Faculty of Mathematics, Statistics and Computer Science, Semnan University, P.O. Box: 35195-363, Semnan, Iran
e-mail: sbk@semnan.ac.ir

Faculty of Mathematical Sciences, Ferdowsi University of Mashhad, P.O. Box: 9177948953, Mashhad, Iran
e-mail: rghanbari@um.ac.ir


Abstract.   Using an extension of some previously proposed modified secant equations within the Dai-Liao approach, a modified nonlinear conjugate gradient method is proposed. As interesting features, the method employs objective function values in addition to gradient information and satisfies the sufficient descent property for proper choices of its parameter. Global convergence of the method is established without any convexity assumption on the objective function. Results of numerical comparisons are reported; they demonstrate the efficiency of the proposed method in the sense of the Dolan-Moré performance profile.
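For readers unfamiliar with the framework the paper builds on, the classical Dai-Liao direction update (reference [12] below) can be sketched as follows. This is a minimal illustration of the standard scheme d_{k+1} = -g_{k+1} + β_k d_k with β_k = g_{k+1}ᵀ(y_k - t s_k)/(d_kᵀ y_k), not the extended method proposed in the paper (which modifies the secant vector using function values and analyzes Wolfe line searches); the simple Armijo backtracking, the restart safeguard, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Classical Dai-Liao CG sketch with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                    # safeguard: restart if not a descent direction
            d = -g
        # Armijo backtracking (the paper's analysis uses Wolfe conditions instead)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        s = alpha * d                     # step  s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                     # gradient difference  y_k
        denom = d @ y
        # Dai-Liao parameter: beta = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# illustrative strictly convex quadratic:  min 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))   # converges to the solution of A x = b
```

The parameter t > 0 controls the extra term relative to the Hestenes-Stiefel formula (t = 0 recovers HS); choosing it well is exactly the question the Dai-Liao literature cited below, and this paper's extension, address.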

2010 Mathematics Subject Classification.   65K05, 90C53, 49M37.

Key words and phrases.   Unconstrained optimization, large-scale optimization, conjugate gradient method, sufficient descent property, nonconvexity, global convergence.


Full text (PDF) (free access)

DOI: 10.3336/gm.52.2.12


References:

  1. N. Andrei, Numerical comparison of conjugate gradient algorithms for unconstrained optimization, Stud. Inform. Control 16 (2007), 333-352.

  2. N. Andrei, Open problems in conjugate gradient algorithms for unconstrained optimization, Bull. Malays. Math. Sci. Soc. (2) 34 (2011), 319-330.

  3. S. Babaie-Kafaki, A modified BFGS algorithm based on a hybrid secant equation, Sci. China Math. 54 (2011), 2019-2036.

  4. S. Babaie-Kafaki, On the sufficient descent condition of the Hager-Zhang conjugate gradient methods, 4OR 12 (2014), 285-292.

  5. S. Babaie-Kafaki and R. Ghanbari, The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices, European J. Oper. Res. 234 (2014), 625-630.

  6. S. Babaie-Kafaki and R. Ghanbari, A descent family of Dai-Liao conjugate gradient methods, Optim. Methods Softw. 29 (2014), 583-591.

  7. S. Babaie-Kafaki and R. Ghanbari, An extended three-term conjugate gradient method with sufficient descent property, Miskolc Math. Notes 16 (2015), 45-55.

  8. S. Babaie-Kafaki and R. Ghanbari, A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach, Optim. Methods Softw. 30 (2015), 673-681.

  9. S. Babaie-Kafaki and R. Ghanbari, Two optimal Dai-Liao conjugate gradient methods, Optimization 64 (2015), 2277-2287.

  10. S. Babaie-Kafaki, R. Ghanbari and N. Mahdavi-Amiri, Two new conjugate gradient methods based on modified secant equations, J. Comput. Appl. Math. 234 (2010), 1374-1386.

  11. Y.H. Dai and C.X. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM J. Optim. 23 (2013), 296-320.

  12. Y.H. Dai and L.Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (2001), 87-101.

  13. E.D. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles, Math. Program. 91 (2002), 201-213.

  14. J.A. Ford and I.A. Moghrabi, Multi-step quasi-Newton methods for optimization, J. Comput. Appl. Math. 50 (1994), 305-323.

  15. J.A. Ford, Y. Narushima and H. Yabe, Multi-step nonlinear conjugate gradient methods for unconstrained minimization, Comput. Optim. Appl. 40 (2008), 191-216.

  16. N.I.M. Gould, D. Orban and Ph.L. Toint, CUTEr: a constrained and unconstrained testing environment, revisited, ACM Trans. Math. Softw. 29 (2003), 373-394.

  17. Q. Guo, J.G. Liu and D.H. Wang, A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule, J. Appl. Math. Comput. 28 (2008), 435-446.

  18. W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16 (2005), 170-192.

  19. W.W. Hager and H. Zhang, Algorithm 851: CG-Descent, a conjugate gradient method with guaranteed descent, ACM Trans. Math. Software 32 (2006), 113-137.

  20. W.W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods, Pac. J. Optim. 2 (2006), 35-58.

  21. M.R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards 49 (1952), 409-436.

  22. D.H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, J. Comput. Appl. Math. 129 (2001), 15-35.

  23. G. Li, C. Tang, and Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math. 202 (2007), 523-539.

  24. D.C. Liu and J. Nocedal, On the limited memory BFGS method for large-scale optimization, Math. Programming 45 (1989), 503-528.

  25. M.J.D. Powell, Restart procedures for the conjugate gradient method, Math. Programming 12 (1977), 241-254.

  26. M.J.D. Powell, Nonconvex minimization calculations and the conjugate gradient method, in D.F. Griffiths (Ed.), Numerical analysis (Dundee, 1983), Lecture Notes in Math. 1066, Springer, Berlin, 1984, 122-141.

  27. K. Sugiki, Y. Narushima, and H. Yabe, Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, J. Optim. Theory Appl. 153 (2012), 733-757.

  28. W. Sun and Y.X. Yuan, Optimization theory and methods. Nonlinear programming, Springer, New York, 2006.

  29. Z. Wei, G. Li and L. Qi, New quasi-Newton methods for unconstrained optimization problems, Appl. Math. Comput. 175 (2006), 1156-1188.

  30. P. Wolfe, Convergence conditions for ascent methods, SIAM Rev. 11 (1969), 226-235.

  31. C. Xu and J.Z. Zhang, A survey of quasi-Newton equations and quasi-Newton methods for optimization, Ann. Oper. Res. 103 (2001), 213-234.

  32. H. Yabe and M. Takano, Global convergence properties of nonlinear conjugate gradient methods with modified secant condition, Comput. Optim. Appl. 28 (2004), 203-225.

  33. Y.X. Yuan, A modified BFGS algorithm for unconstrained optimization, IMA J. Numer. Anal. 11 (1991), 325-332.

  34. Y.X. Yuan and R.H. Byrd, Non-quasi-Newton updates for unconstrained optimization, J. Comput. Math. 13 (1995), 95-107.

  35. J. Zhang and C. Xu, Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, J. Comput. Appl. Math. 137 (2001), 269-278.

  36. J.Z. Zhang, N.Y. Deng and L.H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, J. Optim. Theory Appl. 102 (1999), 147-167.

  37. W. Zhou and L. Zhang, A nonlinear conjugate gradient method based on the MBFGS secant condition, Optim. Methods Softw. 21 (2006), 707-714.