Fast transformations and self-similar deep learning neural networks. Part 3. Pyramidal deep learning neural networks

Alexander Yu. Dorogov

PJSC "Information Telecommunication Technologies" ("Inteltech"), St. Petersburg State Electrotechnical University

The paper considers a class of fast neural networks with a pyramidal structure. Methods for the topological construction of one-dimensional and two-dimensional pyramidal networks are presented. Networks of this class are representable by linear operators, have a self-similar structure, and constitute a special case of the fast Fourier transform algorithm. Topological models of pyramidal neural networks of forward and reverse orientation are proposed. The paper demonstrates the use of fast-learning pyramidal neural networks for correlation-based digital signal and image processing, combinational logic, and memory elements. Examples of constructing a binary-code encoder and decoder are considered. It is noted that a pyramidal memory network provides storage and exact recovery of images, analogous to data storage in random-access computer memory. It is proved that a fast pyramidal network is a deep learning neural network and that its self-similar structure allows the network to be trained on new data without complete retraining. This work is the third part of the generalizing article "Fast transformations and self-similar deep learning neural networks". The first part considered stratified models of self-similar neural networks; the second part considered training algorithms for fast neural networks and generalized spectral transformations.
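To illustrate the architecture the abstract describes, the following is a minimal sketch (not the author's implementation; the names fast_pyramid_apply and dif_fft_kernels are hypothetical): a one-dimensional network of log2(N) layers of trainable 2x2 kernels wired in the radix-2 butterfly pattern. Fixing the kernels to the DFT butterfly values recovers the classical FFT, consistent with the claim that the FFT is a special case of such a network.

```python
# Hypothetical sketch of a 1-D "fast" pyramidal linear network.
# Each of the log2(N) layers applies one trainable 2x2 kernel per
# butterfly pair, wired with the radix-2 decimation-in-frequency pattern.
import numpy as np

def fast_pyramid_apply(x, kernels):
    """Apply the layered network to a signal of length N = 2**L.

    kernels[l] has shape (N // 2, 2, 2): one 2x2 block per butterfly
    pair in layer l. With free (trainable) entries this is a fast
    tunable transformation; with DFT twiddle factors it is the FFT.
    """
    N = x.shape[0]
    L = int(np.log2(N))
    y = x.astype(np.complex128)
    for l in range(L):
        half = N >> (l + 1)            # distance between paired samples
        out = np.empty_like(y)
        pair = 0
        for start in range(0, N, 2 * half):
            for k in range(half):
                a, b = y[start + k], y[start + k + half]
                K = kernels[l][pair]
                out[start + k] = K[0, 0] * a + K[0, 1] * b
                out[start + k + half] = K[1, 0] * a + K[1, 1] * b
                pair += 1
        y = out
    return y

def dif_fft_kernels(N):
    """Kernels that specialize the network to the DFT (output bit-reversed)."""
    kernels = []
    for l in range(int(np.log2(N))):
        M = N >> l                     # butterfly block size at layer l
        K = np.empty((N // 2, 2, 2), dtype=np.complex128)
        pair = 0
        for _ in range(N // M):
            for k in range(M // 2):
                w = np.exp(-2j * np.pi * k / M)
                K[pair] = [[1.0, 1.0], [w, -w]]
                pair += 1
        kernels.append(K)
    return kernels

# Check: with FFT kernels and bit-reversed reordering of the output,
# the network reproduces numpy's FFT.
N = 8
rev = [int(format(i, '03b')[::-1], 2) for i in range(N)]
x = np.random.randn(N)
y = fast_pyramid_apply(x, dif_fft_kernels(N))
assert np.allclose(y[rev], np.fft.fft(x))
```

Because every 2x2 kernel is an independent parameter block, such a network has far fewer degrees of freedom than a dense linear layer (N log2(N) blocks versus N^2 weights), which is what makes fast layer-by-layer tuning possible.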

fast tunable transformation; neural network; pyramidal structure; deep learning; code decoder; code encoder; neural network memory; neural network plasticity; degrees of freedom
