Recurrent Neural Networks with Fractional Order Gradient Method
- Resource Type
- Conference
- Authors
- Yang, Honggang; Fan, Rui; Chen, Jiejie; Xu, Mengfei
- Source
- 2022 14th International Conference on Advanced Computational Intelligence (ICACI), pp. 49-55, Jul. 2022
- Subject
- Computing and Processing
- Training
- Gradient methods
- Recurrent neural networks
- Sensitivity
- Computational modeling
- Stochastic processes
- Classification algorithms
- stochastic gradient descent
- recurrent neural network
- fractional calculus
- fractional difference
- Language
- Abstract
To address the possibility that the stochastic gradient descent (SGD) method used to train Recurrent Neural Networks (RNNs) converges to a local optimum, this paper proposes two fractional stochastic gradient descent methods. The methods improve the network parameter update rule by substituting the derivative with the Caputo fractional derivative and the difference with the Riemann-Liouville fractional difference, respectively. Based on the characteristics of gradient descent, the influence of the fractional order on the training results is analyzed, and two adaptive order-adjustment methods are proposed. Experiments on the MNIST and FashionMNIST datasets show that the fractional stochastic gradient optimization algorithms improve both the classification accuracy and the training speed of recurrent neural networks.
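The abstract does not give the paper's exact update rules, but the general idea of a Caputo-based fractional gradient step can be sketched as follows. In one commonly used Caputo-based formulation, the plain gradient g is scaled by |θ − θ_prev|^(1−α) / Γ(2−α), where 0 < α ≤ 1 is the fractional order and α = 1 recovers ordinary SGD. The function name, hyperparameters, and toy objective below are illustrative assumptions, not the authors' algorithm:

```python
import math

def caputo_fractional_sgd(grad_fn, theta0, alpha=0.9, lr=0.1, steps=200, eps=1e-8):
    """Minimise a 1-D objective with a Caputo-style fractional gradient step.

    Illustrative sketch: the gradient g is scaled by
    |theta - theta_prev|**(1 - alpha) / Gamma(2 - alpha),
    a common Caputo-based fractional gradient form.
    alpha = 1 reduces to ordinary SGD (the scale factor becomes 1).
    """
    theta, theta_prev = theta0, theta0
    coeff = 1.0 / math.gamma(2.0 - alpha)  # 1 / Gamma(2 - alpha)
    for _ in range(steps):
        g = grad_fn(theta)
        # eps keeps the power term well-defined on the first step,
        # when theta == theta_prev.
        step = lr * coeff * g * (abs(theta - theta_prev) + eps) ** (1.0 - alpha)
        theta_prev, theta = theta, theta - step
    return theta

# Toy example: minimise f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x_star = caputo_fractional_sgd(lambda x: 2.0 * (x - 3.0), theta0=0.0, alpha=0.9)
```

The fractional order α acts as an extra knob on the effective step size, which is why the paper studies its influence on training and proposes adaptive schemes for adjusting it during optimization.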