Improved Stochastic Gradient Descent Algorithm with Mean-Gradient Adaptive Stepsize for Solving Large-Scale Optimization Problems

  • Wah June Leong Department of Mathematics and Statistics, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor
  • Munierah Zulkifli Department of Mathematics and Statistics, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor
  • Nor Aliza Abd Rahmin Department of Mathematics and Statistics, Universiti Putra Malaysia, 43400 UPM Serdang, Selangor

Abstract

Stochastic gradient descent (SGD) is one of the most common algorithms for solving large-scale unconstrained optimization problems. It builds on the classical gradient descent method but modifies how the gradient is obtained: instead of the full gradient, SGD uses gradients computed from random samples or batches of the data. It is an iterative algorithm with descent properties that reduces computational cost by using derivatives of random data points. This paper proposes a new SGD algorithm with a modified stepsize that employs a function-scaling strategy. In particular, the stepsize parameter is coupled with function scaling by storing the mean of the gradients in its denominator. The performance of the method is evaluated on its ability to reduce the function value after each iteration and its ability to attain the lowest function value when applied to the well-known zebra-strip problem. Our results indicate that the proposed method performs favourably compared with the existing method.
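The abstract describes coupling the stepsize with the running mean of past stochastic gradients. A minimal sketch of one such scheme is below; it is illustrative only, since the paper's exact stepsize formula is not given here. The function names (`sgd_mean_gradient`, `grad_fn`), the choice of the norm of the running mean gradient in the denominator, and the added constant 1 (which keeps the stepsize bounded above by `eta0`) are all assumptions, not the authors' formulation.

```python
import numpy as np

def sgd_mean_gradient(grad_fn, x0, data, eta0=0.05, epochs=20, seed=0):
    """SGD with a mean-gradient adaptive stepsize (illustrative sketch).

    Assumed stepsize rule (not the paper's exact formula):
        eta_k = eta0 / (1 + ||mean of past stochastic gradients||),
    so the stepsize is large while the accumulated mean gradient is
    small and shrinks when gradients have been consistently large.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    grad_sum = np.zeros_like(x)   # running sum of stochastic gradients
    k = 0
    for _ in range(epochs):
        for i in rng.permutation(len(data)):
            g = grad_fn(x, data[i])          # gradient from one random sample
            k += 1
            grad_sum += g
            mean_grad = grad_sum / k         # mean of all gradients so far
            step = eta0 / (1.0 + np.linalg.norm(mean_grad))
            x -= step * g
    return x

# Usage: fit y = a*x by least squares on noisy synthetic data (a = 3).
rng = np.random.default_rng(1)
xs = rng.normal(size=200)
ys = 3.0 * xs + 0.01 * rng.normal(size=200)
data = list(zip(xs, ys))

def grad_fn(w, sample):
    xi, yi = sample
    # gradient of (w*xi - yi)^2 with respect to w
    return np.array([2.0 * (w[0] * xi - yi) * xi])

w = sgd_mean_gradient(grad_fn, np.array([0.0]), data)
```

Because the denominator grows with the magnitude of the accumulated mean gradient, this sketch damps the stepsize in directions of persistently large gradients, in the spirit of the adaptive-stepsize methods of Duchi et al. (2011) cited by the paper.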

Published
2023-11-19
How to Cite
LEONG, Wah June; ZULKIFLI, Munierah; ABD RAHMIN, Nor Aliza. Improved Stochastic Gradient Descent Algorithm with Mean-Gradient Adaptive Stepsize for Solving Large-Scale Optimization Problems. Menemui Matematik (Discovering Mathematics), [S.l.], v. 45, n. 2, p. 224-230, Nov. 2023. ISSN 0126-9003. Available at: <https://myjms.mohe.gov.my/index.php/dismath/article/view/24687>. Date accessed: 26 July 2024.
