A Hybrid Conjugate Gradient Method Between MLS and FR in Nonparametric Statistics

Document Type: Original paper

Authors

1 Laboratory of Informatics and Mathematics (LIM), Mohamed Cherif Messaadia University, Souk Ahras, Algeria

2 Mohamed Cherif Messaadia University, Souk Ahras 41000, Algeria

Abstract

This paper proposes a novel hybrid conjugate gradient method for nonparametric statistical inference. The proposed method is a convex combination of the modified linear search (MLS) and Fletcher-Reeves (FR) methods and inherits the advantages of both: the fast convergence of FR and the robustness to noise of MLS. The method is evaluated on a variety of nonparametric statistical problems, including kernel density estimation, regression, and classification, and the results show that it outperforms both the MLS and FR methods in accuracy and efficiency.
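
To make the convex-combination idea concrete, the sketch below implements a generic hybrid nonlinear conjugate gradient step of the form beta_k = theta * beta_k^(secondary) + (1 - theta) * beta_k^(FR). The abstract does not state the paper's MLS or hybrid-parameter formulas, so the function name hybrid_cg, the fixed mixing parameter theta, the simple Armijo backtracking line search, and the use of the classical Liu-Storey coefficient as a stand-in for MLS are all illustrative assumptions, not the authors' scheme.

import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Minimize f via a convex-combination hybrid CG direction (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a simplification; hybrid CG analyses
        # usually assume strong Wolfe conditions).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)            # Fletcher-Reeves coefficient
        beta_ls = (g_new @ (g_new - g)) / (-(d @ g))   # Liu-Storey, used here only as a placeholder for MLS
        beta = theta * beta_ls + (1.0 - theta) * beta_fr   # convex combination, theta in [0, 1]
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Toy usage: minimize ||x - 1||^2 in R^5 from the origin.
f = lambda x: float(np.sum((x - 1.0) ** 2))
grad = lambda x: 2.0 * (x - 1.0)
print(hybrid_cg(f, grad, np.zeros(5)))   # expected: approximately [1, 1, 1, 1, 1]

In practice, hybrid CG methods choose the mixing parameter adaptively at each iteration and rely on (strong) Wolfe line searches to guarantee descent and global convergence; the fixed theta and Armijo rule above are simplifications for readability.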
