Sains Malaysiana 43(10)(2014): 1591–1597

 

BFGS Method: A New Search Direction

(Kaedah BFGS: Arah Carian Baharu)

 

 

MOHD. ASRUL HERY BIN IBRAHIM1, MUSTAFA MAMAT2* & LEONG WAH JUNE3

 

1Fakulti Keusahawanan dan Perniagaan, Universiti Malaysia Kelantan,

16100 Pengkalan Chepa, Kelantan, Malaysia

 

2Fakulti Informatik dan Komputeran, Universiti Sultan Zainal Abidin,

21030 Kuala Terengganu, Terengganu, Malaysia

 

3Jabatan Matematik, Fakulti Sains, Universiti Putra Malaysia

43400 Serdang, Selangor, D.E. Malaysia

 

 

Received: 25 April 2013/Accepted: 13 February 2014

 

ABSTRACT

In this paper, we present a new line search method known as the HBFGS method, which combines the search direction of the conjugate gradient method with quasi-Newton updates. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) update is used as the approximation of the Hessian for both methods. The new algorithm is compared with the BFGS method in terms of iteration count and CPU time. Our numerical results provide strong evidence that the proposed HBFGS method is more efficient than the ordinary BFGS method. In addition, we prove that the new algorithm is globally convergent.
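For readers who want a concrete picture of the baseline being compared against, the ordinary BFGS method with a backtracking (Armijo) line search can be sketched as below. This is a minimal, generic illustration only: the function names, the Armijo parameters and the Rosenbrock test problem are our own choices for the sketch, and the paper's HBFGS search direction (which hybridises the quasi-Newton step with a conjugate-gradient direction) is not reproduced here.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    """Plain BFGS with an Armijo backtracking line search (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                  # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                 # quasi-Newton search direction
        # Armijo backtracking: shrink the step until sufficient decrease holds
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:             # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimise the 2-D Rosenbrock function from the classical start point.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = bfgs(rosen, rosen_grad, [-1.2, 1.0])
```

The update skips iterations where the curvature condition sᵀy > 0 fails, which is what keeps the inverse-Hessian approximation positive definite and the direction -Hg a descent direction.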

 

Keywords: BFGS method; conjugate gradient method; globally convergent; HBFGS method

 

ABSTRAK

 

In this paper, we present a new search method known as the HBFGS method, which uses the search direction of the conjugate gradient method with quasi-Newton updates. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) update is used as the formula for approximating the Hessian in both methods. The new algorithm is compared with the quasi-Newton method in terms of iteration count and CPU time. Numerical results show that the HBFGS method performs better than the original BFGS method. In addition, we prove that the new algorithm is globally convergent.

 

Keywords: globally convergent; BFGS method; HBFGS method; conjugate gradient method

 

 

*Corresponding author; email: must@unisza.edu.my
