Weight regularization in spiking neural networks

Dmitry I. Antonov, Sergey V. Sukhov

Ulyanovsk State Technical University; Ulyanovsk Branch of Kotelnikov Institute of Radioengineering and Electronics, RAS

Overfitting of an artificial neural network arises when training captures not only essential features but also insignificant ones, i.e. noise. Regularization methods are intended to minimize the influence of random noise and to extract regular features. A number of regularization methods exist for second-generation artificial neural networks (dropout, L1 regularization, L2 regularization, etc.), but these conventional methods are not suitable for the third generation, spiking neural networks (SNNs), which offer more energy-efficient and biologically plausible computation. Information in an SNN is transmitted by short pulses (spikes), and learning occurs locally. The biological "use it or lose it" principle of brain neurons states that a synaptic connection that is not used weakens and eventually disappears. Applying this principle to an SNN amounts to making the synaptic weights time-dependent: a weight is reduced in proportion to the time its synaptic connection has been "silent". In this paper, a new weight regularization method for SNNs is proposed, based on pruning unused weights during network training; pruning occurs because each weight acquires a dependence on the time elapsed since the last spike. In the experiments, a two-layer SNN was used, trained with a combined Hebbian rule previously developed by the authors from the local learning rules STDP (spike-timing-dependent plasticity) and all-LTD (all-long-term-depression). The SNN was trained and tested on the MNIST dataset of handwritten digits: 15,000 images for training and 1,500 for testing, with only 3 of the 10 possible classes used in the experiments.
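The "use it or lose it" mechanism described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact method: the exponential decay form, the `decay_rate` and `prune_threshold` parameters, and the function name are all assumptions made for the example.

```python
import math

def regularize_weights(weights, last_spike_times, t_now,
                       decay_rate=0.05, prune_threshold=0.01):
    # Decay each synaptic weight in proportion to how long its synapse has
    # been "silent" (no spike since last_spike_times[i]), then prune
    # near-zero weights. decay_rate and prune_threshold are illustrative
    # values, not taken from the paper.
    new_weights = []
    for w, t_spike in zip(weights, last_spike_times):
        silence = t_now - t_spike                 # time since the last spike
        w = w * math.exp(-decay_rate * silence)   # longer silence -> stronger decay
        new_weights.append(0.0 if w < prune_threshold else w)
    return new_weights

# Toy example: three synapses; the first spiked recently, the others are silent.
w = regularize_weights([0.5, 0.5, 0.5], [95.0, 10.0, 0.0], t_now=100.0)
# The recently active weight is only slightly decayed; the two long-silent
# weights fall below the threshold and are pruned to zero.
```

In an actual SNN training loop such a step would run alongside the STDP/all-LTD weight updates, so that synapses that never carry spikes are gradually removed while active synapses continue to learn.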

spiking neural network, overfitting, regularization method
