Cybernetics and Computer Engineering, 2021, 4(206)
MOROZ O.H., PhD (Engineering),
Senior Researcher of Dept. for Information Technologies of Inductive Modeling
ORCID: 0000-0002-0356-8780, e-mail: olhahryhmoroz@gmail.com
STEPASHKO V.S., DSc (Engineering), Professor,
Head of Dept. for Information Technologies of Inductive Modeling
ORCID: 0000-0001-7882-3208, e-mail: firstname.lastname@example.org
International Research and Training Centre for Information Technologies
and Systems of the National Academy of Sciences of Ukraine and Ministry
of Education and Science of Ukraine,
40, Acad. Glushkov av., Kyiv, 03187, Ukraine
COMPARATIVE FEATURES OF MIA GMDH AND DEEP FEED-FORWARD NEURAL NETWORKS
Introduction. Deep neural networks are effective tools for solving practical tasks such as data mining, modeling, forecasting, pattern recognition, clustering, classification, etc. They differ in architecture design, learning methods, and so on. The simplest and most widely used ones are deep feed-forward supervised NNs.
The purpose of the paper is to briefly compare the main features of deep feed-forward deterministic supervised networks with those of the Multilayered Iterative Algorithm of GMDH (MIA GMDH) and to formulate the main ideas of constructing a new class of hybrid deep networks based on the MIA neural network.
Methods. The most commonly used deep feed-forward supervised neural networks were studied: the multilayered perceptron, the convolutional NN and some of its modifications, polynomial neural networks, the genetic polynomial neural network, etc.
Results. A comparative analysis of the main features of the MIA GMDH neural network and the characteristics of other deep deterministic supervised neural networks was carried out. The most promising approaches to improving the performance of this network are identified, particularly hybridization with methods of computational intelligence. The main idea of building a new class of hybrid deep networks based on MIA GMDH is formulated.
Conclusions. MIA GMDH and its modifications are original representatives of self-organizing networks that can potentially give the best results, especially in the big data case. Hybridization of GMDH-based NNs with stochastic methods of computational intelligence is suggested to achieve a synergetic effect.
Keywords: multilayered iterative algorithm of GMDH (MIA GMDH), self-organizing neural network, neural network architecture, deep neural networks, feed-forward neural networks, supervised neural networks, deep learning.
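The self-organizing, layer-by-layer construction that distinguishes MIA GMDH from conventionally trained deep networks can be illustrated with a minimal sketch. The code below is the editor's own illustration, not the authors' implementation; all function and parameter names (e.g. mia_gmdh, width) are hypothetical. Each layer fits a quadratic partial description for every pair of current inputs by least squares on a training subsample, ranks candidates by an external criterion (MSE on a validation subsample), keeps the best few as inputs to the next layer, and stops when the criterion ceases to improve.

```python
# Illustrative sketch of MIA GMDH (hypothetical names, not the paper's code).
import numpy as np
from itertools import combinations

def _design(xi, xj):
    # Quadratic partial description: 1, xi, xj, xi*xj, xi^2, xj^2
    return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])

def mia_gmdh(X_train, y_train, X_val, y_val, width=4, max_layers=5):
    """Return the best external-criterion value and that model's
    predictions on the validation subsample."""
    Zt, Zv = X_train, X_val          # current layer inputs
    best_err, best_pred = np.inf, None
    for _ in range(max_layers):
        candidates = []
        for i, j in combinations(range(Zt.shape[1]), 2):
            A = _design(Zt[:, i], Zt[:, j])
            coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
            pred_v = _design(Zv[:, i], Zv[:, j]) @ coef
            err = np.mean((y_val - pred_v) ** 2)     # external criterion
            candidates.append((err, A @ coef, pred_v))
        candidates.sort(key=lambda c: c[0])
        if candidates[0][0] >= best_err:             # self-organizing stop
            break
        best_err, best_pred = candidates[0][0], candidates[0][2]
        keep = candidates[:width]                    # freedom-of-choice
        Zt = np.column_stack([c[1] for c in keep])   # outputs feed next layer
        Zv = np.column_stack([c[2] for c in keep])
    return best_err, best_pred
```

Unlike back-propagation networks, no global gradient training is involved: each layer's partial models are estimated independently, and the validation-based external criterion both selects survivors and determines the network depth.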