
Chinese Journal of Engineering Mathematics ›› 2022, Vol. 39 ›› Issue (5): 681-694. doi: 10.3969/j.issn.1005-3085.2022.05.001


A Krylov Subspace Optimization Method in Artificial Neural Network

ZHANG Zhenyu 1,2,  LIN Muyang 1

  1. School of Mathematics, Shanghai University of Finance and Economics, Shanghai 200433;
    2. Shanghai University of Finance and Economics Zhejiang College, Jinhua 321013
  • Online: 2022-10-15  Published: 2022-12-15
  • Supported by:
    The National Natural Science Foundation of China (11671246).

Abstract:

The development of algorithms for optimizing the loss function of artificial neural networks is introduced in this work. The KSD (Krylov Subspace Descent) algorithm is extended to the MKSD (Modified KSD) algorithm, which uses an adaptively variable subspace dimension instead of a fixed one. Numerical examples of optimizing fully connected neural networks with the MKSD, KSD, and SGD (Stochastic Gradient Descent) algorithms are given. The numerical results show that the MKSD method has certain advantages over the other methods.
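The abstract does not give algorithmic details, so the following is only a minimal sketch of the general Krylov subspace descent idea, not the authors' implementation. The names `ksd_step`, `loss_fn`, `grad_fn` and the finite-difference Hessian-vector product are assumptions introduced for illustration: at each outer iteration the gradient g and the products Hg, ..., H^{k-1}g span a k-dimensional Krylov subspace, and the loss is then minimized over the k subspace coefficients.

```python
import numpy as np
from scipy.optimize import minimize

def ksd_step(loss_fn, grad_fn, w, k=5, eps=1e-6):
    """One Krylov subspace descent step with a fixed subspace dimension k (sketch).

    Builds an orthonormal basis of span{g, Hg, ..., H^{k-1}g} using
    finite-difference Hessian-vector products, then minimizes the loss
    over the k subspace coefficients with BFGS.
    """
    g = grad_fn(w)
    V = np.zeros((w.size, k))
    V[:, 0] = g / (np.linalg.norm(g) + 1e-12)
    for j in range(1, k):
        v = V[:, j - 1]
        # Hessian-vector product H v approximated by central differences of the gradient.
        u = (grad_fn(w + eps * v) - grad_fn(w - eps * v)) / (2.0 * eps)
        u -= V[:, :j] @ (V[:, :j].T @ u)          # Gram-Schmidt orthogonalization
        V[:, j] = u / (np.linalg.norm(u) + 1e-12)
    # Inner problem: minimize loss(w + V a) over the small coefficient vector a.
    res = minimize(lambda a: loss_fn(w + V @ a), np.zeros(k), method="BFGS")
    return w + V @ res.x
```

In this sketch the dimension k is fixed, as in plain KSD; MKSD, as described in the abstract, would instead vary the subspace dimension adaptively from iteration to iteration according to the rule given in the full paper.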

Key words: artificial neural network, Krylov subspace, optimization method

CLC Number: