Fractional Gradient Descent Method for Spiking Neural Networks
- Resource Type
- Conference
- Authors
- Yang, Honggang; Chen, Jiejie; Jiang, Ping; Xu, Mengfei; Zhao, Haiming
- Source
- 2023 2nd Conference on Fully Actuated System Theory and Applications (CFASTA), pp. 636-641, Jul. 2023
- Subject
- Aerospace
Robotics and Control Systems
Signal Processing and Analysis
Training
Backpropagation
Computational modeling
Neural networks
Stochastic processes
Robustness
Timing
Spiking Neural Networks
Stochastic Gradient Descent Method
Fractional Order Difference
- Language
- English
- Abstract
A fractional-order gradient descent method for spiking neural networks (SNNs) is proposed to address the difficulty of training SNNs with the stochastic gradient descent method. The method modifies both the location of the gradient backpropagation computation during SNN training and the output form of the last layer in the network structure, and its convergence is proved using the fractional-order difference expression under the Grünwald-Letnikov definition. Test results on both a pure SNN model and a CNN2SNN model show that the method effectively reduces the number of sample cycle inputs, and thus the training time, without reducing accuracy.
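The Grünwald-Letnikov fractional difference mentioned in the abstract can be illustrated with a small sketch. The binomial-type weights $w_k = (-1)^k \binom{\alpha}{k}$ are standard for the GL definition; the update rule below (applying the truncated GL difference to the recent gradient history), the toy quadratic objective, and all hyperparameters are illustrative assumptions only, not the paper's exact formulation.

```python
import numpy as np

def gl_coeffs(alpha, K):
    # Grünwald-Letnikov weights w_k = (-1)^k * binom(alpha, k),
    # computed via the recurrence w_0 = 1, w_k = w_{k-1} * (1 - (alpha+1)/k).
    # For alpha = 1 this reduces to [1, -1, 0, 0, ...] (classical first difference).
    w = np.empty(K + 1)
    w[0] = 1.0
    for k in range(1, K + 1):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def fractional_gd(grad_fn, theta0, alpha=0.9, lr=0.1, steps=200, K=8):
    # Illustrative fractional-order update: keep a short gradient history
    # and step along the truncated GL fractional difference of that history.
    # (The paper's method differs in where backpropagation is computed
    # and in the output form of the last layer; this is only a sketch.)
    w = gl_coeffs(alpha, K)
    theta = np.asarray(theta0, dtype=float)
    history = []
    for _ in range(steps):
        history.insert(0, grad_fn(theta))  # newest gradient first
        history = history[: K + 1]         # truncate the GL memory
        direction = sum(wk * g for wk, g in zip(w, history))
        theta = theta - lr * direction
    return theta

# Toy objective f(x) = ||x - 3||^2 with gradient 2(x - 3); the iterate
# drifts toward the minimizer at (3, 3).
theta = fractional_gd(lambda x: 2.0 * (x - 3.0), np.zeros(2))
```

Because the GL weights beyond $w_0$ are negative and decay quickly, the truncated sum acts like a gradient step corrected by recent history; with $\alpha = 1$ only $w_0$ and $w_1$ survive and the operator collapses to the ordinary first difference.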