Least-Squares Delayed Weighted Gradient Method

Authors

  • Rafael Aleixo, Federal University of Santa Catarina
  • Hugo Lara Urdaneta, Federal University of Santa Catarina

DOI:

https://doi.org/10.5540/03.2022.009.01.0267

Keywords:

Least squares, linear systems, iterative method, delayed weighted gradient method.

Abstract

The delayed weighted gradient method (DWGM) has been proved to be a robust iterative procedure for solving convex quadratic optimization problems. Its theoretical and numerical performance is similar to that of the conjugate gradient method. In this work we specialize DWGM to least-squares problems. Numerical experiments are presented to show the effectiveness of the approach.
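Since the least-squares objective min ‖Ax − b‖² is a convex quadratic with Hessian AᵀA and gradient g = Aᵀ(Ax − b), DWGM can be applied through the normal equations. The following is a minimal sketch, assuming the two-step update rules of the original DWGM for strictly convex quadratics (minimal-gradient step followed by a delayed combination with the previous gradient); the authors' specialized least-squares variant may organize the computation differently:

```python
import numpy as np

def dwgm_lsq(A, b, x0=None, tol=1e-10, max_iter=200):
    """Sketch: DWGM applied to min 0.5*||Ax - b||^2 via the normal
    equations.  Each iteration needs two products with A and two
    with A^T; A^T A is never formed explicitly."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float)
    g = A.T @ (A @ x - b)              # gradient of 0.5*||Ax - b||^2
    x_prev, g_prev = x.copy(), g.copy()  # "delayed" iterate and gradient
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        w = A.T @ (A @ g)              # Hessian-vector product (A^T A) g
        t = (g @ w) / (w @ w)          # minimal-gradient steplength
        y = x - t * g                  # intermediate point
        r = g - t * w                  # gradient at y
        d = g_prev - r
        if d @ d == 0.0:               # guard against a degenerate step
            x, g = y, r
            break
        beta = (g_prev @ d) / (d @ d)  # minimizes ||g_prev + beta*(r - g_prev)||
        x_new = x_prev + beta * (y - x_prev)
        g_new = g_prev + beta * (r - g_prev)
        x_prev, g_prev = x, g          # shift the delay window
        x, g = x_new, g_new
    return x
```

On a small well-conditioned problem this reproduces the standard least-squares solution; in exact arithmetic DWGM, like CG, terminates finitely on quadratics.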



Published

2022-12-08

Issue

Section

Full Papers (Trabalhos Completos)