Issue 2 (208), article 1


Cybernetics and Computer Engineering, 2022, 2(208)

RACHKOVSKIJ D.A.1,2, DSc (Engineering),
Chief Researcher, Dept. of Neural Information Processing Technologies,
Visiting Professor, Dept. of Computer Science, Electrical and Space Engineering,

GRITSENKO V.I.1, Corresponding Member of NAS of Ukraine,
 Directorate Adviser,
ORCID ID 0000-0002-6250-3987

VOLKOV O.E.1, PhD (Engineering), 

GOLTSEV A.D.1, PhD (Engineering), Senior Researcher,
Acting Head of the Dept. of Neural Information Processing Technologies,

REVUNOVA E.G.1, DSc (Engineering),
Senior Researcher, Dept. of Neural Information Processing Technologies,

KLEYKO D.3, PhD (Computer Science), Researcher,

LUKOVICH V.V.1, Researcher, Dept. of Neural Information Processing Technologies,
ORCID ID 0000-0002-3848-4712

OSIPOV E.2, PhD (Computer Science), Professor,

1 International Research and Training Center for Information Technologies and Systems of the NAS of Ukraine and of the MES of Ukraine, 40, Acad. Glushkova av., Kyiv, 03680, Ukraine

2 Department of Computer Science, Electrical and Space Engineering, Lulea University of Technology, 971 87 Lulea, Sweden

3 RISE Research Institutes of Sweden AB, 164 40 Kista, Sweden


Introduction. Current progress in the field of specialized Artificial Intelligence is associated with the use of Deep Neural Networks. However, they have a number of disadvantages: the need for huge training data sets, the complexity of learning procedures, excessive specialization to the training set, lack of robustness to adversarial attacks, lack of integration with knowledge of the world, and difficulties in operating with structures, known as the binding or composition problem. Overcoming these shortcomings is a necessary condition for advancing from specialized Artificial Intelligence to a general one, which requires the development of alternative approaches.

The purpose of the paper is to present an overview of research in this direction, which has been carried out at the International Center for 25 years. The approach being developed stems from the ideas of N. M. Amosov and his scientific school. Connections to the Hyperdimensional Computing (HDC) and Vector Symbolic Architectures (VSA) field as well as to current brain research are also provided.

Results. The concept of distributed data representation is outlined, including HDC/VSA that are capable of representing various data structures. The developed paradigm of Associative-Projective Neural Networks is considered: codevector representation of data, superposition and binding operations, general architecture, transformation of data of various types into codevectors, methods for solving problems and applications.
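The superposition and binding operations mentioned above can be illustrated with a minimal sketch in the Binary Spatter Code style of HDC/VSA (component-wise XOR for binding, thresholded sum for superposition). This is an illustrative stand-in for the general idea, not the sparse binary codevector scheme of Associative-Projective Neural Networks; all names and parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

def random_hv():
    """Random dense binary hypervector."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Binding: component-wise XOR; the result is dissimilar to both inputs
    and is its own inverse (bind(bind(a, b), b) == a)."""
    return a ^ b

def bundle(*hvs):
    """Superposition: component-wise majority (ties broken toward 0);
    the result stays similar to each of its inputs."""
    return (np.sum(hvs, axis=0) > len(hvs) / 2).astype(np.uint8)

def sim(a, b):
    """Normalized Hamming similarity in [0, 1]; about 0.5 for unrelated vectors."""
    return 1.0 - np.mean(a != b)

# role-filler pairs composed into a single fixed-size vector
color, shape = random_hv(), random_hv()
red, circle = random_hv(), random_hv()
record = bundle(bind(color, red), bind(shape, circle))

# unbinding with the role recovers a noisy version of the filler
assert sim(bind(record, color), red) > 0.6
assert sim(bind(record, color), circle) < 0.6
```

The key property, shared with codevector representations, is that composite structures keep the same fixed dimensionality as their parts, so the same operations apply at every level of composition.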

Conclusion. An adequate representation of data is one of the key issues in Artificial Intelligence. The main area of research reviewed in this article is the problem of representing heterogeneous data in Artificial Intelligence systems in a unified format based on modeling the neural organization of the brain and the mechanisms of thinking. The approach under development is based on the hypothesis of distributed representation of information in the brain and allows representing various types of data, from numeric values to graphs, as vectors of large but fixed dimensionality.
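One way such fixed-dimensional vector representations of numeric data can be formed is by similarity-preserving random projection. The sketch below uses sign random projections (a dense variant of the idea; the reviewed work develops sparse binary projections), so it should be read as an assumption-laden illustration rather than the authors' method:

```python
import numpy as np

rng = np.random.default_rng(1)

def binary_projection(x, R):
    """Map a real-valued vector to a binary code via random projection;
    angular similarity of inputs is approximately preserved as bit agreement."""
    return (R @ x > 0).astype(np.uint8)

d, D = 16, 4096                        # input and code dimensionalities
R = rng.standard_normal((D, d))        # shared random projection matrix

x = rng.standard_normal(d)
y = x + 0.1 * rng.standard_normal(d)   # small perturbation of x
z = rng.standard_normal(d)             # unrelated vector

hx, hy, hz = (binary_projection(v, R) for v in (x, y, z))
assert np.mean(hx == hy) > np.mean(hx == hz)  # similar inputs -> similar codes
```

Because every input, whatever its original type, ends up as a vector of the same fixed dimensionality, heterogeneous data can be compared and combined with a single set of operations.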

The most important advantages of the developed approach are the possibility of natural integration and efficient processing of various types of data and knowledge, a high degree of parallel computing, reliability and resistance to noise, the possibility of hardware implementation with high performance and energy efficiency, and data processing based on associative similarity search — similar to how human memory works. This allows one to unify the methods, algorithms, software, and hardware of Artificial Intelligence systems and to increase their scalability in terms of speed and memory as data volume and complexity grow.
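Associative similarity search over such representations can be sketched as retrieval from an "item memory" by nearest Hamming distance; the item names and noise level below are hypothetical, chosen only to show the noise tolerance the paragraph above refers to:

```python
import numpy as np

rng = np.random.default_rng(2)
D = 10_000

# item memory: codevectors for known items, searched by similarity
items = {name: rng.integers(0, 2, D, dtype=np.uint8)
         for name in ("cat", "dog", "car")}

def noisy(hv, flip=0.2):
    """Corrupt a codevector by flipping a fraction of its bits."""
    mask = (rng.random(D) < flip).astype(np.uint8)
    return hv ^ mask

def recall(query):
    """Associative retrieval: return the stored item nearest to the query
    in Hamming distance."""
    return min(items, key=lambda k: np.count_nonzero(items[k] ^ query))

# retrieval succeeds even when a fifth of the query's bits are corrupted
assert recall(noisy(items["dog"])) == "dog"
```

With high-dimensional random codes, the distance gap between the correct item and unrelated ones is large, which is what makes this lookup robust to noise.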

The research creates the basis for overcoming the shortcomings of current approaches to the specialized Artificial Intelligence based on Deep Neural Networks and paves the way for the creation of Artificial General Intelligence.

Keywords: distributed data representation, associative-projective neural networks, codevectors, hyperdimensional computing, vector symbolic architectures, artificial intelligence.


1. Amosov N.M. Modelling of thinking and the mind. New York: Spartan Books, 1967, 192 p.

2. Amosov N.M., Kasatkin A.M., Kasatkina L.M., Kussul E.M., Talaev S.A. Intelligent behaviour systems based on semantic networks. Kybernetes. 1973, Vol. 2, N 4, pp. 211-216.

3. Amosov N.M., Kussul E.M., Fomenko V.D. Transport robot with a neural network control system. Advance papers of 4th Intern. Joint Conf. on Artif. Intelligence. 1975, Vol. 9, pp. 1-10.

4. Amosov N.M. Algorithms of the Mind. Kiev: Naukova Dumka, 1979, 223 p. (in Russian)

5. Amosov N.M., Baidyk T.N., Goltsev A.D., Kasatkin A.M., Kasatkina L.M., Rachkovskij D.A. Neurocomputers and intelligent robots. Kiev: Naukova Dumka. 1991, 269 p. (in Russian)

6. Kussul E.M. Associative neuron-like structures. Kiev: Naukova Dumka. 1992, 144 p. (in Russian)

7. Kleyko D., Rachkovskij D.A., Osipov E., Rahimi A. A survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and data transformations. ACM Computing Surveys. 2022.

8. Kleyko D., Rachkovskij D.A., Osipov E., Rahimi A. A survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges. Accepted, ACM Computing Surveys. 2022. Available online: arXiv:2112.15424.

9. Thorpe S. Localized versus distributed representations. In: The Handbook of Brain Theory and Neural Networks. Edited by M.A. Arbib. Cambridge, MA: The MIT Press. 2003, pp. 643-646.

10. Rachkovskij D.A., Kussul E.M. Binding and normalization of binary sparse distributed representations by context-dependent thinning. Neural Computation. 2001, Vol. 13, N 2, pp. 411-452.

11. Plate T. Holographic Reduced Representation: Distributed Representation for Cognitive Structures. Stanford: CSLI Publications, 2003, 300 p.

12. Kanerva P. Hyperdimensional computing: an introduction to computing in distributed representation with high-dimensional random vectors. Cognitive Computation. 2009, Vol. 1, N 2, pp. 139-159.

13. Gayler R.W. Multiplicative binding, representation operators, and analogy. Advances in Analogy Research: Integration of Theory and Data from the Cognitive, Computational, and Neural Sciences. Sofia, Bulgaria: New Bulgarian University, 1998, p. 405.

14. Kussul E.M., Rachkovskij D.A., Baidyk T.N. Associative-Projective Neural Networks: Architecture, Implementation, Applications. Proc. Neuro-Nimes’91. 1991, pp. 463-476.

15. Kussul E.M., Rachkovskij D.A., Baidyk T.N. On image texture recognition by associative-projective neurocomputer. Proc. ANNIE’91 Conference “Intelligent engineering systems through artificial neural networks”. St. Louis, MO: ASME Press, 1991, pp. 453-458.

16. Kussul E.M., Rachkovskij D.A. Multilevel assembly neural architecture and processing of sequences. In Neurocomputers and Attention: Connectionism and Neurocomputers, vol. 2. Manchester and New York: Manchester University Press, 1991, pp. 577-590.

17. Gritsenko V.I., Rachkovskij D.A., Goltsev A.D., Lukovych V.V., Misuno I.S., Revunova E.G., Slipchenko S.V., Sokolov A.M., Talayev S.A. Neural distributed representation for intelligent information technologies and modeling of thinking. Cybernetics and Computer Engineering. 2013, Vol. 173, pp. 7-24. (in Russian).

18. Rachkovskij D.A., Kussul E.M., Baidyk T.N. Building a world model with structure-sensitive sparse binary distributed representations. Biologically Inspired Cognitive Architectures. 2013, Vol. 3, pp. 64-86.

19. Gritsenko V.I., Rachkovskij D.A., Frolov A.A., Gayler R., Kleyko D., Osipov E. Neural distributed autoassociative memories: A survey. Cybernetics and Computer Engineering. 2017, N 2 (188), pp. 5-35.

20. Frolov A.A., Rachkovskij D.A., Husek D. On information characteristics of Willshaw-like auto-associative memory. Neural Network World. 2002. Vol. 12, N 2, pp. 141-158.

21. Frolov A.A., Husek D., Rachkovskij D.A. Time of searching for similar binary vectors in associative memory. Cybernetics and Systems Analysis. 2006, Vol. 42, N 5, pp. 615-623.

22. Fusi S. Memory capacity of neural network models. arXiv:2108.07839. 21 Dec 2021.

23. Steinberg J., Sompolinsky H. Associative memory of structured knowledge. 2022.

24. Liang J.C., Erez J., Zhang F., Cusack R., Barense M.D. Experience transforms conjunctive object representations: Neural evidence for unitization after visual expertise. Cerebral Cortex. 2020. Vol. 30, N 5, pp. 2721-2739.

25. Li A.Y., Fukuda K., Barense M.D. Independent features form integrated objects: Using a novel shape-color “conjunction task” to reconstruct memory resolution for multiple object features simultaneously. Cognition. 2022, Vol. 223, Article 105024.

26. Andermane N., Joensen B.H., Horner A.J. Forgetting across a hierarchy of episodic representations. Current Opinion in Neurobiology. 2021, Vol. 67, pp. 50-57.

27. Michelmann S., Hasson U., Norman K.A. Event boundaries are steppingstones for memory retrieval. 2021. Preprint DOI 10.31234/

28. Schneegans S., McMaster J.M.V., Bays P.M. Role of time in binding features in visual working memory. Psychological Review. 2022.

29. Geerligs L., van Gerven M., Campbell K.L., Güçlü U. A nested cortical hierarchy of neural states underlies event segmentation in the human brain. Neuroscience. 2021.

30. Peer M., Brunec I.K., Newcombe N.S., Epstein R.A. Structuring knowledge with cognitive maps and cognitive graphs. Trends in Cognitive Sciences. 2021, Vol. 25, N 1, pp. 37-54.

31. Rachkovskij D.A. Shift-equivariant similarity-preserving hypervector representations of sequences. arXiv:2112.15475. 2021.

32. Rachkovskij D.A. Representation and processing of structures with binary sparse distributed codes. IEEE Trans. Knowledge Data Engineering. 2001, Vol. 13, N 2, pp. 261-276.

33. Rachkovskij D.A., Slipchenko S.V. Similarity-based retrieval with structure-sensitive sparse binary distributed representations. Comput. Intelligence. 2012, Vol. 28, N 1, pp. 106-129.

34. Papadimitriou C.H., Vempala S.S., Mitropolsky D., Collins M.J., Maass W. Brain computation by assemblies of neurons. Proceedings of the National Academy of Sciences. 2020, Vol. 117, N 25, pp. 14464-14472.

35. Müller M.G., Papadimitriou C.H., Maass W., Legenstein R. A model for structured information representation in neural networks of the brain. eNeuro. 2020, Vol. 7, N 3, pp. 1-17.

36. Mitropolsky D., Collins M.J., Papadimitriou C.H. A biologically plausible parser. Transactions of the Association for Computational Linguistics. 2021, Vol. 9, pp. 1374-1388.

37. Papadimitriou C.H., Friederici A.D. Bridging the gap between neurons and cognition through assemblies of neurons. Neural Computation. 2022, Vol. 34, N 2, pp. 291-306.

38. Rachkovskij D.A., Misuno I.S., Slipchenko S.V. Randomized projective methods for the construction of binary sparse vector representations. Cybernetics and Systems Analysis. 2012, Vol. 48, N 1, pp. 146-156.

39. Rachkovskij D.A. Vector data transformation using random binary matrices. Cybernetics and Systems Analysis. 2014, Vol. 50, N 6, pp. 960-968.

40. Rachkovskij D.A. Formation of similarity-reflecting binary vectors with random binary projections. Cybernetics and Systems Analysis. 2015, Vol. 51, N 2, pp. 313-323.

41. Rachkovskij D.A. Estimation of vectors similarity by their randomized binary projections. Cybernetics and Systems Analysis. 2015, Vol. 51, N 5, pp. 808-818.

42. Dasgupta S., Stevens C.F., Navlakha S. A neural algorithm for a fundamental computing problem. Science. 2017, Vol. 358, pp. 793-796.

43. Rachkovskij D.A., Gritsenko V.I. Distributed Representation of Vector Data Based on Random Projections. Kyiv: Interservice, 2018. ISBN: 978-617-696-837-5. (in Ukrainian)

44. Rachkovskij D.A. Codevectors: Sparse Binary Distributed Representations of Numerical Data. Kiev: Interservice, 2019. ISBN: 978-617-696-987-7. (in Russian)

45. Gritsenko V.I., Rachkovskij D.A. Methods for Vector Representation of Objects for Fast Similarity Estimation. Kyiv: Naukova Dumka, 2019. ISBN: 978-966-00-1741-2. (In Russian)

46. Rachkovskij D. A. Introduction to fast similarity search. Kyiv: Interservice, 2019, 294 p. ISBN: 978-617-696-904-4. (in Russian)

47. Ghazi B., Panigrahy R., Wang J. Recursive sketches for modular deep learning. Proceedings of the 36th International Conference on Machine Learning. PMLR. 2019, Vol. 97, pp. 2211-2220.

48. Panigrahy R., Wang X., Zaheer M. Sketch based memory for neural networks. Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS’21). 2021, San Diego, California, USA. PMLR. 2021, Vol. 130, pp. 3169-3177.

49. Hiratani N., Sompolinsky H. Optimal quadratic binding for relational reasoning in vector symbolic neural architectures. arXiv:2204.07186. 2022, pp. 1-32.

50. Ryali C., Hopfield J., Grinberg L., Krotov D. Bio-inspired hashing for unsupervised similarity search. Proceedings of the 37th International Conference on Machine Learning. PMLR. 2020, Vol. 119, pp. 8295-8306.

51. Liang Y., Ryali C., Hoover B., Grinberg L., Navlakha S., Zaki M., Krotov D. Can a fruit fly learn word embeddings? Proc. ICLR’21. 2021.

52. Li W. Modeling winner-take-all competition in sparse binary projections. In: Machine Learning and Knowledge Discovery in Databases. Edited by: F. Hutter, K. Kersting, J. Lijffijt, I. Valera. Lecture Notes in Computer Science. Vol. 12457. Cham: Springer, 2021, pp. 456-472.

53. Li W.Y., Zhang S.Z. Binary random projections with controllable sparsity patterns. Journal of the Operations Research Society of China. 2022.

54. Rachkovskij D.A., Slipchenko S.V., Misuno I.S., Kussul E.M., Baidyk T.N. Sparse binary distributed encoding of numeric vectors. Journal of Automation and Information Sciences. 2005, Vol. 37, N 11, pp. 47-61.

55. Izawa S., Kitai K., Tanaka S., Tamura R., Tsuda K. Continuous black-box optimization with quantum annealing and random subspace coding. Proc. Adiabatic Quantum Computing (AQC’21), June 22-25, 2021.

56. Izawa S., Kitai K., Tanaka S., Tamura R., Tsuda K. Continuous black-box optimization with an Ising machine and random subspace coding. Phys. Rev. Research. 2022, Vol. 4, N 2. Article 023062.

57. Barclay I. et al. Trustable service discovery for highly dynamic decentralized workflows. Future Generation Computer Systems. 2022. DOI: 10.1016/j.future.2022.03.035

58. Wei Y., Xie P., Zhang L. Tikhonov regularization and randomized GSVD. SIAM J. Matrix Analysis Appl. 2016, Vol. 37, pp. 649-675.

59. Zhang L., Wei Y. Randomized core reduction for discrete ill-posed problem. J. Comput. Appl. Math. 2020, Vol. 375, Article 112797.

60. Wei W., Zhang H., Yang X., Chen X. Randomized generalized singular value decomposition. Commun. Appl. Math. Comput. 2021, Vol. 3, pp. 137-156.

61. Zuo Q., Wei Y., Xiang H. Quantum-inspired algorithm for truncated total least squares solution. arXiv:2205.00455. 2022

62. Revunova E.G., Rachkovskij D.A. Using randomized algorithms for solving discrete ill-posed problems. J. Information Theories and Applications. 2009, Vol. 16, N 2, pp. 176-192.

63. Rachkovskij D.A., Revunova E.G. Randomized method for solving discrete ill-posed problems. Cybernetics and Systems Analysis. 2012, Vol. 48, N 4, pp. 621-635.

64. Revunova E.G. Analytical study of the error components for the solution of discrete ill-posed problems using random projections. Cybernetics and Systems Analysis. 2015, Vol. 51, N 6, pp. 978-991.

65. Revunova E.G. Model selection criteria for a linear model to solve discrete ill-posed problems on the basis of singular decomposition and random projection. Cybernetics and Systems Analysis. 2016, Vol. 52, N 4, pp. 647-664.

66. Revunova E.G. Increasing the accuracy of solving discrete ill-posed problems by the random projection method. Cybernetics and Systems Analysis. 2018, Vol. 54, N 5, pp. 842-852.

67. Revunova O.G., Tyshchuk O.V., Desiateryk O.O. On the generalization of the random projection method for problems of the recovery of object signal described by models of convolution type. Control Systems and Computers. 2021, N 5-6, pp. 25-34.

68. Sokolov A., Rachkovskij D. Approaches to sequence similarity representation. Int. J. Inf. Theor. Appl. 2006, Vol. 13, N 3, pp. 272-278.

69. Sokolov A. Vector representations for efficient comparison and search for similar strings. Cybernetics and Systems Analysis. 2007, Vol. 43, N 4, pp. 484-498.

70. Rachkovskij D.A. Development and investigation of multilevel assembly neural networks. PhD thesis. Kiev, Ukrainian SSR: V.M. Glushkov Institute of Cybernetics, 1990. (in Russian)

71. Kussul E.M., Baidyk T.N. On information encoding in Associative-Projective Neural Networks. Technical Report 93-3. V.M. Glushkov Institute of Cybernetics, 1993. (in Russian)

72. Kleyko D., Osipov E., De Silva D., Wiklund U., Vyatkin V., Alahakoon D. Distributed representation of n-gram statistics for boosting self-organizing maps with hyperdimensional computing. Proc. PSI’19. 2019, pp. 64-79.

73. Kussul E.M., Baidyk T.N., Wunsch D.C., Makeyev O., Martin A. Permutation coding technique for image recognition systems. IEEE Trans. Neural Networks. 2006, Vol. 17, N 6, pp. 1566-1579.

74. Rachkovskij D.A. Application of stochastic assembly neural networks in the problem of interesting text selection. Neural network systems for information processing. 1996, pp. 52-64. (in Russian)

75. Rachkovskij D.A., Kleyko D. Recursive binding for similarity-preserving hypervector representations of sequences. Proc. IJCNN’22. 2022.

76. Kussul E., Baidyk T. Improved method of handwritten digit recognition tested on MNIST database. Image and Vision Computing. 2004, Vol. 22, pp. 971-981.

77. Makeyev O., Sazonov E., Baidyk T., Martin A. Limited receptive area neural classifier for texture recognition of mechanically treated metal surfaces. Neurocomputing. 2008, Vol. 71, N 7-9, pp. 1413-1421.

78. Rachkovskij D.A. Distance-based index structures for fast similarity search. Cybernetics and Systems Analysis. 2017, Vol. 53, N 4, pp. 636-658.

79. Rachkovskij D.A. Index structures for fast similarity search for binary vectors. Cybernetics and Systems Analysis. 2017, Vol. 53, N 5, pp. 799-820.

80. Kussul E.M., Baidyk T.N., Lukovich V.V., Rachkovskij D.A. Adaptive neural network classifier with multifloat input coding. Proc. NeuroNimes’93, Nimes, France, Oct. 25-29, 1993, pp. 209-216.

81. Lukovich V.V., Goltsev A.D., Rachkovskij D.A. Neural network classifiers for micromechanical equipment diagnostics and micromechanical product quality inspection. Proc. EUFIT’97, 1997, pp. 534-536.

82. Rachkovskij D.A., Kussul E.M. DataGen: a generator of datasets for evaluation of classification algorithms. Pattern Recognition Letters. 1998, Vol.19, N 7, pp. 537-544.

83. Goltsev A.D. Neural networks with assembly organization. Kyiv: Naukova Dumka. 2005, 200 p. (in Russian)

84. Rachkovskij D. Linear classifiers based on binary distributed representations. J. Inf. Theories Appl. 2007, Vol. 14, N 3, pp. 270-274.

85. Goltsev A., Rachkovskij D. A recurrent neural network for partitioning of hand drawn characters into strokes of different orientations. International Journal of Neural Systems. 2001, Vol. 11, pp. 463-475.

86. Goltsev A., Gritsenko V., Húsek D. Segmentation of visual images by sequential extracting homogeneous texture areas. Journal of Signal and Information Processing. 2020, Vol. 11, N 4, pp. 75-102.

87. Goltsev A. D., Gritsenko V.I. Algorithm of sequential finding the textural features characterizing homogeneous texture segments for the image segmentation task. Cybernetics and Computer Engineering. 2013, N 173, pp. 25-34.

88. Goltsev A., Gritsenko V., Kussul E., Baidyk T. Finding the texture features characterizing the most homogeneous texture segment in the image. Proc. IWANN’15. 2015, pp. 287-300.

89. Volkov O., Komar M., Volosheniuk D. Devising an image processing method for transport infrastructure monitoring systems. Eastern-European Journal of Enterprise Technologies. 2021, Vol. 4, N 2 (112), pp. 18-25.

90. Gritsenko V., Volkov O., Komar M., Volosheniuk D. Integral adaptive autopilot for an unmanned aerial vehicle. Aviation. 2018, Vol. 22, N 4, pp. 129-135.

91. Misuno I. S., Rachkovskij D. A., Slipchenko S. V., Sokolov A. M. Searching for text information with the help of vector representations. Probl. Programming. 2005, N 4, pp. 50-59.

92. Goltsev A., Rachkovskij D. Combination of the assembly neural network with a perceptron for recognition of handwritten digits arranged in numeral strings. Pattern Recognition. 2005, Vol. 38, N 3, pp. 315-322.

93. Kasatkina L.M., Lukovich V.V., Pilipenko V.V. Recognition of the person’s voice by the classifier LIRA. Control Systems and Computers. 2006, N. 3, pp. 67-73.

94. Slipchenko S. V. Distributed representations for the processing of hierarchically structured numerical and symbolic information. System Technologies. 2005, N 6, pp. 134-141. (In Russian)

95. Rachkovskij D.A., Revunova E.G. Intelligent gamma-ray data processing for environmental monitoring. In: Intelligent Data Processing in Global Monitoring for Environment and Security. Kiev-Sofia: ITHEA, 2011, pp. 136-157.

96. Revunova E.G., Rachkovskij D.A. Training a linear neural network with a stable LSP solution for jamming cancellation. Intern. Journal Information Theories and Applications. 2005, Vol. 12, N 3, pp. 224-230.

97. Revunova E.G., Rachkovskij D.A. Random projection and truncated SVD for estimating direction of arrival in antenna array. Cybernetics and Computer Engineering. 2018, N 3(193), pp. 5-26.

98. Hinton G. How to represent part-whole hierarchies in a neural network. arXiv:2102.12627. 2021, pp. 1-44.

99. Goyal A., Bengio Y. Inductive biases for deep learning of higher-level cognition. arXiv:2011.15091. 2020, pp. 1-42.

100. Greff K., van Steenkiste S., Schmidhuber J. On the binding problem in artificial neural networks. arXiv:2012.05208. 2020, pp. 1-75.

Received 17.03.2022