Association Journal of CSIAM
Supervised by Ministry of Education of PRC
Sponsored by Xi'an Jiaotong University
ISSN 1005-3085  CN 61-1269/O1

Chinese Journal of Engineering Mathematics ›› 2023, Vol. 40 ›› Issue (5): 833-842. doi: 10.3969/j.issn.1005-3085.2023.05.011


R-linear Convergence of an Importance Sampling Stochastic Gradient Descent Algorithm

WANG Fusheng,  ZHEN Na,  LI Xiaotong   

  1. School of Mathematics and Statistics, Taiyuan Normal University, Jinzhong 030619
  • Received:2022-08-04 Accepted:2023-02-06 Online:2023-10-15 Published:2023-12-15
  • Supported by:
    The Basic Research Plan (Free Exploration) of Shanxi Province (202103021224303); the Scientific Research Project for Returned Overseas Chinese in Shanxi Province (2017-104).

Abstract:

In this paper, we propose a new stochastic variance-reduced gradient algorithm for minimizing the sum of a finite number of smooth convex functions in machine learning. The new algorithm combines a stochastic variance-reduced gradient method with a spectral Barzilai-Borwein (BB) step size, so that the advantages of both techniques are fully exploited. In addition, the new algorithm uses importance sampling, which can greatly reduce the computational workload. Finally, the new algorithm is proved to converge at an R-linear rate under standard assumptions, and a complexity analysis is given. Numerical experiments show that the new algorithm is feasible and effective.
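To make the ingredients concrete, the following is a minimal sketch of an SVRG-style loop with a BB step size updated at each snapshot and importance sampling proportional to per-component Lipschitz constants. The least-squares objective, the sampling weights L_i = ||a_i||^2, and the SVRG-BB-style step-size rule are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

def full_grad(x):
    # Gradient of f(x) = (1/2n) * ||Ax - b||^2
    return A.T @ (A @ x - b) / n

# Importance sampling: draw component i with probability proportional
# to its Lipschitz constant L_i = ||a_i||^2 (a common choice).
L = np.sum(A ** 2, axis=1)
p = L / L.sum()

m = 2 * n                      # inner-loop length
eta = 0.1 / L.mean()           # initial step size before BB kicks in
x_tilde = np.zeros(d)
x_prev, g_prev = None, None

for s in range(30):
    g_tilde = full_grad(x_tilde)
    # BB step size from successive snapshots (SVRG-BB style).
    if g_prev is not None:
        dx, dg = x_tilde - x_prev, g_tilde - g_prev
        denom = dx @ dg
        if denom > 1e-12:
            eta = (dx @ dx) / denom / m
    x_prev, g_prev = x_tilde.copy(), g_tilde.copy()

    x = x_tilde.copy()
    for _ in range(m):
        i = rng.choice(n, p=p)
        a_i = A[i]
        r = a_i @ x - b[i]
        r0 = a_i @ x_tilde - b[i]
        # Variance-reduced gradient, reweighted by 1/(n p_i) so the
        # estimator stays unbiased under non-uniform sampling.
        w = 1.0 / (n * p[i])
        v = w * (a_i * r - a_i * r0) + g_tilde
        x -= eta * v
    x_tilde = x

print("final gradient norm:", np.linalg.norm(full_grad(x_tilde)))
```

Reweighting each sampled gradient by 1/(n p_i) keeps the stochastic estimator unbiased, while sampling high-curvature components more often lowers its variance, which is what allows the larger BB step sizes to remain stable.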

Key words: machine learning, stochastic gradient descent, importance sampling, linear convergence, BB step size
