Fast transformations and self-similar deep learning neural networks. Part 2. Methods of training fast neural networks

Alexander Yu. Dorogov

PJSC “Information Telecommunication Technologies” (“Inteltech”), St. Petersburg State Electrotechnical University

The paper notes that fast neural networks (FNNs) are structurally self-similar to the fast Fourier transform (FFT) algorithm. A method for constructing the matrix form of a fast transformation algorithm is presented, and it is proved that the elements of the FNN matrix factor into the elements of the neural cores. A method of multiplicative factorization of arbitrary one-dimensional images is proposed. It is shown that, owing to their structure, fast neural networks admit special learning algorithms that differ fundamentally from classical error backpropagation in that they require no error back-propagation mechanism. The FNN training algorithms considered here are based on the methods of multiplicative image factorization and fast transformations proposed in this work. Examples are given of tuning the network to the orthogonal Hadamard basis and the Fourier basis, as well as FNN implementations of the Cantor and Sierpinski quasifractals. A method of tuning fast transformations to a reference function, based on fractal filtering of signals, is described, and a method for tuning orthogonal adapted transformations is proposed, with examples. This work is the second part of the overview article "Fast transformations and self-similar deep learning neural networks"; the first part considered stratified models of self-similar neural networks.
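As an illustration of the layered, self-similar structure mentioned above, the sketch below factors the 8-point Walsh-Hadamard matrix into a product of sparse stage factors built from 2x2 kernels, the same kind of layered decomposition that underlies fast transforms and FNNs. The function names and the Kronecker-product construction are illustrative assumptions, not the author's notation or algorithm.

```python
# Illustrative sketch (not the author's notation): a fast transform expressed
# as a product of sparse layer factors, here the 8-point Walsh-Hadamard
# transform. Each layer applies 2x2 kernels (butterflies), mirroring the
# FFT-like layered structure that fast neural networks share.
import numpy as np

def hadamard_layer(n, stage):
    """Sparse factor for one butterfly stage of an n-point transform (n = 2^k)."""
    h2 = np.array([[1, 1], [1, -1]], dtype=float)       # 2x2 kernel
    # Kronecker structure I (x) H2 (x) I places the kernel at the given stage.
    layer = np.eye(2 ** stage)
    layer = np.kron(layer, h2)
    layer = np.kron(layer, np.eye(n // 2 ** (stage + 1)))
    return layer

n = 8
layers = [hadamard_layer(n, s) for s in range(3)]        # log2(n) sparse layers
fast_product = layers[0] @ layers[1] @ layers[2]

# Reference dense Hadamard matrix built directly by Sylvester's construction.
H = np.array([[1.0]])
while H.shape[0] < n:
    H = np.kron(np.array([[1, 1], [1, -1]], dtype=float), H)

print(np.allclose(fast_product, H))                      # True: the layers factor H
```

Applying the log2(n) sparse layers in sequence reproduces the dense transform while each layer touches every sample only once, which is the structural property exploited by the training methods discussed in the paper.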

fast transformations, fast neural networks, topological matrices, orthogonality, adapted transformations
