An Efficient Spectral Parameter for Accelerating the Conjugate Gradient Algorithm

Authors

  • Zyiad Adrees Hamad, University of Mosul, College of Computer Science and Mathematics, Department of Mathematics
  • Ghada Moayid Al-Naemi

DOI:

https://doi.org/10.54153/sjpas.2025.v7i1.987

Keywords:

Conjugate gradient algorithms, Strong Wolfe Conditions, Numerical analysis, Spectral conjugate gradient, Inexact line search

Abstract

Conjugate gradient (CG) methods are renowned for their efficiency on optimization problems because of their speed and modest memory requirements. However, certain CG formulations switch frequently between different parameters, potentially discarding valuable values, and proving adequate descent for these methods often relies on strategies that assume specific function values can be excluded. This study introduces a novel spectral CG (SCG) algorithm that integrates the spectral gradient parameter with the conjugate gradient coefficient. The method stands out for its applicability to large-scale unconstrained optimization problems, and initial numerical results obtained under a strong Wolfe line search are provided to demonstrate its efficiency. In unconstrained optimization, the spectral coefficient is a device for improving the performance and convergence rate of line search algorithms: it adjusts the step size using information from previous iterations of the algorithm. A common application is the spectral gradient method, where the coefficient scales the gradient direction and determines the step length at each iteration, aiming for an optimal balance between speed and stability in reaching the optimal solution.
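To make the abstract's idea concrete, the following is a minimal sketch of a spectral CG iteration. It is not the paper's specific formula: the spectral parameter here is the classical Barzilai-Borwein step computed from the previous iterate and gradient, the CG coefficient is Fletcher-Reeves, and a simple Armijo backtracking loop stands in for the strong Wolfe line search used in the paper.

```python
import numpy as np

def spectral_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Illustrative spectral CG: d_{k+1} = -theta_k * g_{k+1} + beta_k * d_k,
    with theta_k the Barzilai-Borwein spectral parameter (s'. s / s'. y)
    and beta_k the Fletcher-Reeves coefficient. Both are stand-ins, not
    the parameters proposed in this article."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (a simple stand-in for a strong Wolfe search)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # spectral parameter from information gathered in the previous step
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
        d = -theta * g_new + beta * d        # spectral CG search direction
        if g_new @ d >= 0:                   # safeguard: ensure descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Minimize a simple quadratic with minimizer (1, 2)
sol = spectral_cg(lambda x: (x[0] - 1) ** 2 + 4 * (x[1] - 2) ** 2,
                  lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] - 2)]),
                  np.zeros(2))
```

The scaling by theta plays the role the abstract describes: it rescales the step using curvature information from the previous iteration, which is what accelerates convergence relative to an unscaled CG direction.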


Published

2025-03-30

How to Cite

Adrees Hamad, Z., & Moayid Al-Naemi, G. (2025). An Efficient Spectral Parameter for Accelerating the Conjugate Gradient Algorithm. Samarra Journal of Pure and Applied Science, 7(1), 241–264. https://doi.org/10.54153/sjpas.2025.v7i1.987
