Fast transformations and self-similar deep learning neural networks. Part 1: Stratified models of self-similar neural networks and fast transformations

Alexander Yu. Dorogov

PJSC "Information Telecommunication Technologies" ("Inteltech"), St. Petersburg State Electrotechnical University

The paper shows that the construction of fast transformations (similar to the FFT) rests on self-similar structures, which can equally well be used to build fast neural networks (FNN). It is shown that the class of fast transformations is determined by system invariants at the morphological level and can be described as the morphogenesis of terminal projections of neural modules. Linguistic models are proposed to describe the morphology, structure, and topology of regular self-similar neural networks. These models generalize readily to multidimensional variants of neural networks of this class. Owing to their structure, FNNs admit special learning algorithms that differ fundamentally from classical error backpropagation in that they have no mechanism for propagating errors backward. The learning algorithms are based on the methods of multiplicative factorization of images and fast transformations proposed in this work. The developed algorithms complete in a finite number of steps with guaranteed convergence. Consistent development of the concept of self-similarity leads to methods for creating fast neural networks with deep learning capability. Self-similar neural networks have the unique ability to learn from new data without losing previously acquired knowledge. It is shown that FNNs can be used to build high-speed random-access image memory and complex combinational logic devices. The paper presents the results of the author's research on the following topics: biological prerequisites for the self-similarity of neural networks; self-similar multilayer structures, morphogenesis, and stratification of model representations; algorithms for fast transformations, fast neural networks, and tuning methods; training of FNNs on reference functions; plasticity of FNNs; pyramidal deep-learning neural networks; multi-channel correlators; and the implementation of memory and combinational logic on pyramidal structures.
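The recursive structure of the FFT illustrates the kind of self-similarity the abstract refers to: a size-N transform is composed of two size-N/2 copies of itself joined by a sparse "butterfly" stage. The following Python sketch is an illustration of that general idea only, not the paper's construction.

```python
# Illustrative sketch (not the author's construction): the radix-2
# Cooley-Tukey FFT builds a size-N transform from two self-similar
# size-N/2 sub-transforms plus one sparse butterfly stage.
import cmath

def fft(x):
    """Recursive radix-2 FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])  # self-similar sub-transform on even-indexed samples
    odd = fft(x[1::2])   # self-similar sub-transform on odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor
        out[k] = even[k] + w           # butterfly: sum branch
        out[k + n // 2] = even[k] - w  # butterfly: difference branch
    return out

def dft(x):
    """Direct O(N^2) DFT, used here only to check the fast version."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n)
                for j in range(n))
            for k in range(n)]
```

The recursion replaces the dense O(N^2) matrix with O(log N) sparse stages, which is the structural property the paper carries over to neural modules.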
The research results will be presented in three parts of the article.

self-similar structures, fast transformations, morphogenesis, fast neural networks, pyramidal structures, image memory, combinational logic
