Issue 1 (191), article 3


Kibern. vyčisl. teh., 2018, Issue 1 (191), pp.

Grytsenko V.I., Corresponding Member of NAS of Ukraine,
Director of International Research and Training
Center for Information Technologies and Systems
of the National Academy of Sciences of Ukraine
and Ministry of Education and Science of Ukraine
Volkov O.Y., Senior Researcher,
Intellectual Control Department
Komar M.M., Researcher,
Intellectual Control Department
Bogachuk Y.P., PhD (Engineering), Senior Researcher,
Intellectual Control Department
International Research and Training Center for Information
Technologies and Systems of the National Academy
of Sciences of Ukraine and Ministry of Education and Science of Ukraine,
40, Acad. Glushkov av., 03187, Kiev, Ukraine


Introduction. The article discusses topical issues in the creation of modern automatic control systems for unmanned aerial vehicles (UAVs) and describes new methods of their intellectualization. Today’s development of information technology requires accelerated development of the theory of intelligent control and the theory of systemic information technology. New intelligent control technologies are extremely important for solving the problems of modern unmanned aviation.
The purpose of the article is to solve the issues of developing a UAV control system and to propose a number of measures aimed at ensuring its intellectualization. The approach considered in the article is based on the theory of high-precision remote control of dynamic objects and on the complex interaction of methods of the theory of invariance, adaptive control and intellectualization of UAV control processes.
Results. Control algorithms were developed and implemented as functional program modules written in modern high-level programming languages, in a computing environment based on microprocessors with computing power sufficient to implement these algorithms, in the form of a unified hardware and software complex of integrated avionics.
The range of functional capabilities of the UAV control system was expanded: it is proposed to supplement the developed autopilot channels and algorithms with methods of intellectualization.
Conclusions. It is shown that combining the developed UAV autopilot control laws into a unified hardware and software complex of integrated avionics, and supplementing them with the proposed components of intellectualization, will create a synergistic effect and ensure the effectiveness and stability of the process of controlling the motion of the UAV.

Keywords: unmanned aerial vehicle, control system, invariance, intellectualization.



  1. Krasil’shchikov M.N., Serebryakov G.G. Modern information technologies in the tasks of navigation and guidance of unmanned maneuverable aircrafts. Moscow: FIZMATLIT, 2009. 556 p. (in Russian).
  2. Kharchenko V.P., Chepizhenko V.I., Tunik A.A., Pavlova S.V. Avionics of unmanned aerial vehicles. Kyiv: Abris-Print, 2012. 464 p. (in Ukrainian).
  3. Fedosov E.A., Bobronnikov V.T., Kukhtenko V.I. Dynamic design of control systems for automatic maneuverable aircrafts. Moscow: Mashinostroyeniye, 1997. 336 p. (in Russian).
  4. Pavlova S., Komar M. The Invariant Adaptation of the Aircraft Control System in Emergency Situation During the Flight. Proceedings of the National Aviation University. 2016. N 4 (69). P. 28–33.
  5. Fahlstrom P., Gleason T. Introduction to UAV systems. 4th ed. Hoboken: Wiley, 2012. 308 p.
  6. Kortunov V.I., Mazurenko A.V., Ali Hussein Vatic Mohammed. Controls of mini- and micro-UAVs. Radioelectronic and Computer Systems. 2016. N 1. P. 45–55 (in Russian).
  7. Austin R. Unmanned aircraft systems. UAVs design, development and deployment. John Wiley & Sons, 2010. 372 p.
  8. Beard R.W., McLain T.W. Small unmanned aerial vehicles: theory and practice. Moscow: TEKHNOSFERA, 2015. 312 p.
  9. Alyoshin B.S. Orientation and navigation of mobile objects: modern information technologies. Moscow: FIZMATLIT, 2006. 424 p. (in Russian).
  10. Volkov A.E., Pavlova S.V. Modeling of the invariant method for resolving the dynamic conflicts of aircraft. Cybernetics and Systems Analysis. 2017. Vol. 53, N 4. P. 105–112 (in Russian).
  11. Voloshenyuk D.A., Pavlova S.V. Management of aircraft landing in conditions of increasing air traffic. Control Systems and Machines. 2017. N 5. P. 62–74 (in Russian).

Received 27.12.2017

Issue 4 (190), article 1


Kibern. vyčisl. teh., 2017, Issue 4 (190), pp.

Grytsenko V.I., Corresponding Member of NAS of Ukraine,
Director of International Research and Training
Center for Information Technologies and Systems
of the National Academy of Sciences of Ukraine
and Ministry of Education and Science of Ukraine
Onyshchenko I.M., PhD (Economics),
Senior Researcher of the Department of Economic and Social
Systems and Information Technologies
International Research and Training Center for Information
Technologies and Systems of the National Academy of Sciences
of Ukraine and Ministry of Education and Science of Ukraine,
40, Acad. Glushkov av., 03680, Kiev, Ukraine


Introduction. The fast growth of collected and stored data due to the IT boom has caused the so-called “Big Data” problem. Most of the new data are unstructured, which is the core reason why traditional relational data warehouses are inefficient for “Big Data”. Prediction and modeling based on “Big Data” can also be problematic because of high volume and velocity. Online learning algorithms help avoid some of these problems in high-load systems.
The purpose of the article is to develop an approach to feature selection and modeling in the case of “Big Data” using an online learning algorithm.
Method. An online learning algorithm for the FTRL (Follow-The-Regularized-Leader) model with L1 and L2 regularization was used to select only important features.
Results. The approaches to modeling with batch and online learning algorithms are described on the example of an online auction system. The online learning algorithm has strong advantages under high load and high velocity. The mathematical background for modifying the linear discriminator of the FTL (Follow-The-Leader) model by adding regularization is described. L1 and L2 regularization allows important features to be selected in real time. If a feature becomes useless, the regularization sets the corresponding coefficient to 0; however, the feature is not removed from the training process, and the coefficient can be restored to some value if the feature becomes important for the model again. The full process is implemented as a program in Python and can be used in practice.
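For illustration, a minimal Python sketch of such a per-coordinate FTRL-Proximal learner follows (the article’s actual program is not reproduced here; the class name and the hyperparameters alpha, beta, l1, l2 with their default values are our assumptions). It shows the two properties stressed above: L1 regularization zeroes the weight of a currently useless feature without removing it from training, and an update touches only the predictors present in the incoming example:

    import math

    class FTRLProximal:
        """Per-coordinate FTRL-Proximal with L1/L2 regularization for
        logistic loss on sparse binary features (illustrative sketch)."""

        def __init__(self, alpha=0.1, beta=1.0, l1=1.0, l2=1.0):
            self.alpha, self.beta, self.l1, self.l2 = alpha, beta, l1, l2
            self.z = {}  # per-feature FTRL accumulator
            self.n = {}  # per-feature sum of squared gradients

        def _weight(self, i):
            z = self.z.get(i, 0.0)
            if abs(z) <= self.l1:
                # L1 zeroes the coefficient, but z and n keep accumulating,
                # so the feature can be restored if it becomes important
                return 0.0
            n = self.n.get(i, 0.0)
            return -(z - math.copysign(self.l1, z)) / (
                (self.beta + math.sqrt(n)) / self.alpha + self.l2)

        def predict(self, features):
            # features: indices of the predictors present in this example
            score = sum(self._weight(i) for i in features)
            return 1.0 / (1.0 + math.exp(-max(min(score, 35.0), -35.0)))

        def update(self, features, y):
            # one online step; absent predictors are never touched
            p = self.predict(features)
            g = p - y  # log-loss gradient w.r.t. the score (binary x_i = 1)
            for i in features:
                w = self._weight(i)
                n_old = self.n.get(i, 0.0)
                sigma = (math.sqrt(n_old + g * g) - math.sqrt(n_old)) / self.alpha
                self.z[i] = self.z.get(i, 0.0) + g - sigma * w
                self.n[i] = n_old + g * g
            return p

In a stream one would call update(features, y) once per observed example, e.g. model = FTRLProximal(); model.update([3, 17, 42], 1). This pay-only-for-present-features update is what makes the scheme practical for high-velocity sparse data.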
The results may be applied to modeling and forecasting in projects with a high volume or velocity of data, for example social networks, online auctions, online gaming, recommendation systems and others.
Conclusions. The FTRL model was modified to work as an online learning algorithm that predicts binary outcomes in high-load “Big Data” systems.
Taking into account that the number of predictors can be enormous, training can take considerable computing resources and time and complicate the process. This feature selection problem was solved using L1 regularization: the selection procedure was added to the modified online learning FTRL model, and L1 regularization was used to score the importance of predictors in real time.
A program implementing the described algorithm was developed. Note that the algorithm works effectively with sparse matrices, analyzing incoming data and updating weights only for the predictors that are present. The algorithm has L1 and L2 regularization features that may be used for feature selection and for avoiding overfitting.
Keywords: information technologies in economics, economical and mathematical modeling, online learning algorithms, regularization, Big Data.



  1. Mayer-Schönberger V., Cukier K. Big Data: A Revolution That Will Transform How We Live, Work, and Think. Transl. from English. Moscow: Mann, Ivanov i Ferber, 2014. 240 p. (in Russian).
  2. M. Regelson and D. Fain. Predicting click-through rate using keyword clusters. In Proceedings of the Second Workshop on Sponsored Search Auctions, volume 9623. Citeseer, 2006.
  3. M. Richardson, E. Dominowska, and R. Ragno. Predicting clicks: estimating the click-through rate for new ads. In Proceedings of the 16th international conference on World Wide Web, pages 521–530. ACM, 2007.
  4. Shalev-Shwartz, Shai. “Online Learning and Online Convex Optimization”. Foundations and Trends in Machine Learning. 2011. pp. 107–194.
  5. Gilles Gasso. Batch and online learning algorithms for nonconvex Neyman-Pearson classification / Gilles Gasso, Aristidis Pappaioannou, Marina Spivak, Leon Bottou / ACM Transaction on Intelligent System and Technologies, 2(3), 2011.
  6. H Brendan McMahan. Follow-the-regularized-leader and mirror descent: Equivalence theorems and l1 regularization. International Conference on Artificial Intelligence and Statistics, pages 525–533, 2011.
  7. Franks B. Taming the Big Data Tidal Wave: Finding Opportunities in Huge Data Streams with Advanced Analytics. Transl. from English. Moscow: Mann, Ivanov i Ferber, 2014. 352 p. (in Russian).
  8. Shakhovska N.B., Boliubash Yu.Ia. The Big Data model “Entity — Characteristic”. 2015. [Electronic resource] (in Ukrainian).
  9. Chernyak L. Big Data — a new theory and practice. Otkrytye Sistemy. SUBD. Moscow: Otkrytye Sistemy, 2011. N 10. [Electronic resource] (in Russian).
  10. Uskenbayeva R.K., Kuandykov A.A., Kalizhanova A.U. Tasks of resources provision of distributed computer systems functionality. Dubai: World Academy of Science, Engineering and Technology. 2012. Iss. 70. P. 580–581.
  11. Bekkerman R., Bilenko M., Langford J. Scaling up machine learning: Parallel and distributed approaches. 2011.
  12. H.B. McMahan. Follow-the-regularized-leader and mirror descent: Equivalence theorems and L1 regularization. In AISTATS, 2011.
  13. H.B. McMahan and M. Streeter. Adaptive bound optimization for online convex optimization. In COLT, 2010.
  14. Hrytsenko V.I. Application of Big Data tools to increase the efficiency of online advertising. Economic-Mathematical Modeling of Socio-Economic Systems. Issue 21. Kyiv, 2016. P. 5–21 (in Ukrainian).
  15. Big Data — Wikipedia. [Electronic resource].
  16. What is Real-Time Bidding. [Electronic resource] (in Russian).
  17. Introduction to online machine learning: Simplified. [Electronic resource].
  18. Friedman J.H., Hastie T., Tibshirani R. Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software. 2010. Vol. 33, N 1. P. 1–22.
  19. L1 and L2 regularization in machine learning. [Electronic resource] (in Russian).
  20. L1 regularization of linear regression. Least angle regression (the LARS algorithm). VetrovSem11_LARS.pdf. [Electronic resource] (in Russian).

Received 28.09.2017

Issue 2 (188), article 6

Kibern. vyčisl. teh., 2017, Issue 2 (188), pp.


May 23, 2017 marked the 80th anniversary of Vladimir Ilyich Gritsenko, a well-known scientist in computer science, information technologies and their applications in economics, the industrial and technological field, biological and medical cybernetics and computer technologies for education, and Director of the International Research and Training Center for Information Technologies and Systems. He initiated the development of a new class of high technologies — intelligent information technologies. V.I. Gritsenko is a member of a number of leading international and Ukrainian state councils on informatics, the Permanent Representative of Ukraine to the Council of the UNESCO Intergovernmental Programme on Information and Communication, head of the UNESCO Chair “New Information Technologies in Education for All”, and editor-in-chief of the scientific journals “Control Systems and Machines” and “Cybernetics and Computer Engineering”.


Issue 2 (188), article 1


Kibern. vyčisl. teh., 2017, Issue 2 (188), pp.

Grytsenko V.I., Corresponding Member of NAS of Ukraine, Director
International Research and Training Center for Information Technologies and Systems of the NAS of Ukraine and of Ministry of Education and Science of Ukraine,
av. Acad. Glushkova, 40, Kiev, 03680, Ukraine

Rachkovskij D.A., Doctor of Engineering, Leading Researcher,
Dept. of Neural Information Processing Technologies,
International Research and Training Center for Information Technologies and Systems of the NAS of Ukraine and of Ministry of Education and Science of Ukraine,
av. Acad. Glushkova, 40, Kiev, 03680, Ukraine

Frolov A.A., Doctor of Biology, Professor,
Faculty of Electrical Engineering and Computer Science FEI,
Technical University of Ostrava, 17 listopadu 15, 708 33 Ostrava-Poruba, Czech Republic

Gayler R., PhD,
Independent Researcher,
Melbourne, VIC, Australia

Kleyko D., PhD Student,
Department of Computer Science, Electrical and Space Engineering,
Lulea University of Technology, 971 87 Lulea, Sweden

Osipov E., PhD, Professor,
Department of Computer Science, Electrical and Space Engineering,
Lulea University of Technology, 971 87 Lulea, Sweden


Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors) where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear time search (in the number of stored items) for approximate nearest neighbors among vectors of high dimension.

The purpose of the paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons).

Scope. The survey focuses mainly on the networks of Hopfield, Willshaw, and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections, as well as networks with a bipartite graph structure for non-binary data with linear constraints.
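
To make the Willshaw scheme concrete, here is a minimal Python sketch (the sizes n, M, k and the partial-cue retrieval below are our illustrative choices, not figures from the survey); note that the number of stored vectors M deliberately exceeds the number of neurons n, as discussed in the Introduction:

    import numpy as np

    rng = np.random.default_rng(0)
    n, M, k = 1000, 2000, 10  # neurons, stored vectors, active bits per vector

    # sparse binary patterns: exactly k of n bits set
    patterns = np.zeros((M, n), dtype=np.uint8)
    for p in patterns:
        p[rng.choice(n, size=k, replace=False)] = 1

    # learning: binary Hebbian (clipped outer-product) rule,
    # W[i, j] = 1 iff units i and j were ever co-active in a stored pattern
    W = np.zeros((n, n), dtype=np.uint8)
    for x in patterns:
        idx = np.flatnonzero(x)
        W[np.ix_(idx, idx)] = 1

    # retrieval from a partial cue: half of one stored vector's active bits
    x = patterns[123]
    cue = np.zeros(n, dtype=np.uint8)
    cue[np.flatnonzero(x)[: k // 2]] = 1

    s = W.astype(np.int32) @ cue           # dendritic sums
    y = (s >= cue.sum()).astype(np.uint8)  # threshold = number of cue bits

    assert (y & x).sum() == k  # every active bit of the stored vector recovered
    print("spurious bits:", int(y.sum()) - k)  # few, while the load stays low

Despite M > n, retrieval here is near-perfect because the patterns are sparse; as more vectors are stored, the weight matrix fills up and spurious bits appear, which is the capacity trade-off analyzed in the works surveyed below.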

Conclusions. In conclusion we discuss the relations to similarity search, advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, in particular for the case of very high dimensional vectors.

Keywords: distributed associative memory, sparse binary vector, Hopfield network, Willshaw memory, Potts model, nearest neighbor, similarity search



  1. Abbott L.F., Arian Y. Storage capacity of generalized networks. Physical Review A. 1987. Vol. 36, N 10. P. 5091–5094.
  2. Ahle T.D. Optimal Las Vegas locality-sensitive data structures. arXiv:1704.02054. 6 Apr 2017.
  3. Aliabadi B. K., Berrou C., Gripon V., Jiang X. Storing sparse messages in networks of neural cliques. IEEE Trans. NNLS. 2014. Vol. 25. P. 980–989.
  4. Amari S. Characteristics of sparsely encoded associative memory. Neural Networks. 1989. Vol. 2, N 6. P. 451–457.
  5. Amari S., Maginu K. Statistical neurodynamics of associative memory. Neural Networks. 1988. Vol. 1. P. 63–73.
  6. Amit D.J. Modeling brain function: the world of attractor neural networks. Cambridge: Cambridge University Press, 1989. 554 p.
  7. Amit D.J., Fusi S. Learning in neural networks with material synapses. Neural Computation. 1994. V. 6, N 5. P. 957–982.
  8. Amit D.J., Gutfreund H., Sompolinsky H. Statistical mechanics of neural networks near saturation. Annals of Physics. 1987. Vol. 173. P. 30–67.
  9. Amosov N. M. Modelling of thinking and the mind. New York: Spartan Books. 1967.
  10. Anderson J. A. A theory for the recognition of items from short memorized lists. Psychological Review. 1973. Vol. 80, N 6. P. 417–438.
  11. Anderson J. A. Cognitive and psychological computation with neural models. IEEE trans. Systems, Man, and Cybernetics. 1983. Vol. 13, N 5. P. 799–814.
  12. Anderson J.A., Murphy G.L. Psychological concepts in a parallel system. Physica D. 1986. Vol. 22, N 1–3. P. 318–336.
  13. Anderson J.A., Silverstein J.W., Ritz S.A., Jones R.S. Distinctive features, categorical perception and probability learning: Some applications of a neural model. Psychological Review. 1977. V. 84. P. 413–451.
  14. Andoni A., Laarhoven T., Razenshteyn I., Waingarten E. Optimal hashing-based time-space trade-offs for approximate near neighbors. Proc. SODA’17. 2017. P. 47–66.
  15. Baidyk T.N., Kussul E.M. Structure of neural assembly. Proc. RNNS/IEEE symposium on neuroinformatics and neurocomputers. 1992. P. 423–434.
  16. Baidyk T.N., Kussul E.M., Rachkovskij D.A. Numerical-analytical method for neural network investigation. Proc. NEURONET’90. 1990. P. 217–219.
  17. Baldi P., Venkatesh S.S. Number of stable points for spin-glasses and neural networks of higher orders. Physical Review Letters. 1987. Vol. 58, N 9. P. 913–916.
  18. Becker A., Ducas L., Gama N., Laarhoven T. New directions in nearest neighbor searching with applications to lattice sieving. Proc. SODA’16. 2016. P. 10–24.
  19. Boguslawski B., Gripon V., Seguin F., Heitzmann F. Twin neurons for efficient real-world data distribution in networks of neural cliques: Applications in power management in electronic circuits. IEEE Trans. NNLS. 2016. Vol. 27, N 2. P. 375–387.
  20. Bovier A. Sharp upper bounds on perfect retrieval in the Hopfield model. J. Appl. Probab. 1999. Vol. 36, N 3. P. 941–950.
  21. Braitenberg V. Cell assemblies in the cerebral cortex. In Theoretical approaches to complex systems. Berlin: Springer-Verlag. 1978. P. 171–188.
  22. Broder A., Mitzenmacher M. Network applications of Bloom filters: A survey. Internet mathematics. 2004. Vol. 1, N 4. P. 485–509.
  23. Brunel N., Carusi F., Fusi S. Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network. Network. 1998. Vol. 9. P. 123–152.
  24. Buckingham J., Willshaw D. On setting unit thresholds in an incompletely connected associative net. Network. 1993. Vol. 4. P. 441–459.
  25. Burshtein D. Non-direct convergence radius and number of iterations of the Hopfield associative memory. IEEE Trans. Inform. Theory. 1994. Vol. 40. P. 838–847.
  26. Burshtein D. Long-term attraction in higher order neural networks. IEEE Trans. Neural Networks. 1998. Vol. 9, N 1. P. 42–50.
  27. Christiani T., Pagh R. Set similarity search beyond MinHash. Proc. STOC’17. 2017.
  28. Cole R., Gottlieb L.-A., Lewenstein M. Dictionary matching and indexing with errors and don’t cares. Proc. STOC’04. 2004. P. 91–100.
  29. Dahlgaard S., Knudsen M.B.T., Thorup M. Fast similarity sketching. arXiv:1704.04370. 14 Apr 2017.
  30. Demircigil M., Heusel J., Lowe M., Upgang S., Vermet F. On a model of associative memory with huge storage capacity. J. Stat. Phys. 2017. doi:10.1007/s10955-017-1806-y.
  31. Donaldson R., Gupta A, Plan Y., Reimer T. Random mappings designed for commercial search engines. arXiv:1507.05929. 21 Jul 2015.
  32. Feigelman M.V., Ioffe L.B. The augmented models of associative memory – asymmetric interaction and hierarchy of patterns. Int. Journal of Modern Physics B. 1987. Vol. 1, N 1, P. 51–68.
  33. Ferdowsi S., Voloshynovskiy S., Kostadinov D., Holotyak T. Fast content identification in highdimensional feature spaces using sparse ternary codes. Proc. WIFS’16. 2016. P. 1–6.
  34. Frolov A.A., Husek D., Muraviev I.P. Information capacity and recall quality in sparsely encoded Hopfield-like neural network: Analytical approaches and computer simulation. Neural Networks. 1997. Vol. 10, N 5. P. 845–855.
  35. Frolov A.A., Husek D., Muraviev I.P. Informational efficiency of sparsely encoded Hopfield-like associative memory. Optical Memory & Neural Networks. 2003. Vol. 12, N 3. P. 177–197.
  36. Frolov A.A., Husek D., Muraviev I.P., Polyakov P. Boolean factor analysis by attractor neural network. IEEE Trans. Neural Networks. 2007. Vol. 18, N 3. P. 698–707.
  37. Frolov A.A., Husek D., Polyakov P.Y. Recurrent neural-network-based boolean factor analysis and its application to word clustering. IEEE Trans. Neural Networks. 2009.Vol. 20, N 7. P. 1073–1086.
  38. Frolov A.A., Husek D., Rachkovskij D.A. Time of searching for similar binary vectors in associative memory. Cybernetics and Systems Analysis. 2006. Vol. 42, N 5. P. 615–623.
  39. Frolov A.A., Muraviev I.P. Neural models of associative memory. Moscow: Nauka, 1987. 161 p.
  40. Frolov A.A., Muraviev I.P. Information characteristics of neural networks. Moscow: Nauka, 1988. 160 p.
  41. Frolov A.A., Muraviev I.P. Information characteristics of neural networks capable of associative learning based on Hebbian plasticity. Network. 1993. Vol. 4, N 4. P. 495–536.
  42. Frolov A., Kartashov A., Goltsev A., Folk R. Quality and efficiency of retrieval for Willshaw-like autoassociative networks. Correction. Network. 1995. Vol. 6. P. 513–534.
  43. Frolov A., Kartashov A., Goltsev A., Folk R. Quality and efficiency of retrieval for Willshaw-like autoassociative networks. Recognition. Network. 1995. Vol. 6. P. 535–549.
  44. Frolov A.A., Rachkovskij D.A., Husek D. On information characteristics of Willshaw-like auto-associative memory. Neural Network World. 2002. Vol. 12, N 2. P. 141–157.
  45. Gallant S.I., Okaywe T.W. Representing objects, relations, and sequences. Neural Computation. 2013. Vol. 25, N 8. P. 2038–2078.
  46. Gardner E. Multiconnected neural network models. Journal of Physics A. 1987. Vol. 20, N 11. P. 3453–3464.
  47. Gardner E. The space of interactions in neural-network models. J. Phys. A. 1988. Vol. 21. P. 257–270.
  48. Gibson W. G., Robinson J. Statistical analysis of the dynamics of a sparse associative memory. Neural Networks. 1992. Vol. 5. P. 645–661.
  49. Golomb D., Rubin N., Sompolinsky H. Willshaw model: Associative memory with sparse coding and low firing rates. Phys. Rev. A. 1990. Vol. 41, N 4. P. 1843–1854.
  50. Goltsev A. An assembly neural network for texture segmentation. Neural Networks. 1996. Vol. 9, N 4. P. 643–653.
  51. Goltsev A. Secondary learning in the assembly neural network. Neurocomputing. 2004. Vol. 62. P. 405–426.
  52. Goltsev A., Húsek D. Some properties of the assembly neural networks. Neural Network World. 2002. Vol. 12, N 1. P. 15–32.
  53. Goltsev A., Wunsch D.C. Generalization of features in the assembly neural networks. International Journal of Neural Systems. 2004. Vol. 14, N 1. P. 1–18.
  54. Goltsev A.D. Neural networks with the assembly organization. Kiev: Naukova Dumka, 2005. 200 p.
  55. Goltsev A., Gritsenko V. Modular neural networks with Hebbian learning rule. Neurocomputing. 2009. Vol. 72. P. 2477–2482.
  56. Goltsev A., Gritsenko V. Modular neural networks with radial neural columnar architecture. Biologically Inspired Cognitive Architectures. 2015. Vol. 13, P. 63–74.
  57. Goswami M., Pagh R., Silvestri F., Sivertsen J. Distance sensitive bloom filters without false negatives. Proc. SODA’17. 2017. P. 257–269.
  58. Gripon V., Berrou C. Sparse neural networks with large learning diversity. IEEE Trans. on Neural Networks. 2011. Vol. 22, N 7. P. 1087–1096.
  59. Gripon V., Heusel J., Lowe M., Vermet F. A comparative study of sparse associative memories. Journal of Statistical Physics. 2016. Vol. 164. P. 105–129.
  60. Gripon V., Lowe M., Vermet F. Associative memories to accelerate approximate nearest neighbor search. ArXiv:1611.05898. 10 Nov 2016.
  61. Gritsenko V.I., Rachkovskij D.A., Goltsev A.D., Lukovych V.V., Misuno I.S., Revunova E.G., Slipchenko S.V., Sokolov A.M., Talayev S.A. Neural distributed representation for intelligent information technologies and modeling of thinking. Cybernetics and Computer Engineering. 2013. Vol. 173. P. 7–24.
  62. Guo J. K., Brackle D. V., Lofaso N., Hofmann M. O. Vector representation for sub-graph encoding to resolve entities. Procedia Computer Science. 2016. Vol. 95. P. 327–334.
  63. Gutfreund H. Neural networks with hierarchically correlated patterns. Physical Review A. 1988. Vol. 37, N 2. P. 570–577.
  64. Hacene G. B., Gripon V., Farrugia N., Arzel M., Jezequel M. Finding all matches in a database using binary neural networks. Proc. COGNITIVE’17. 2017. P. 59–64.
  65. Hebb D.O. The Organization of Behavior: A Neuropsychological Theory. New York: Wiley, 1949. 335 p.
  66. Herrmann M., Ruppin E., Usher M. A neural model of the dynamic activation of memory. Biol. Cybern. 1993. Vol. 68. P. 455–463.
  67. Heusel J., Lowe M., Vermet F. On the capacity of an associative memory model based on neural cliques. Statist. Probab. Lett. 2015. Vol. 106. P. 256–261.
  68. Hopfield J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. of the Nat. Acad. Sci. USA. 1982. Vol. 79, N 8. P. 2554–2558.
  69. Hopfield J.J., Feinstein D.I., Palmer R.G. “Unlearning” has a stabilizing effect in collective memories. Nature. 1983. Vol. 304. P. 158–159.
  70. Horn D., Usher M. Capacities of multiconnected memory models. Journal de Physique, 1988. Vol. 49, N 3. P. 389–395.
  71. Horner H., Bormann D., Frick M., Kinzelbach H., Schmidt A. Transients and basins of attraction in neutral network models. Z. Physik B. 1989. Vol. 76. P. 381–398.
  72. Howard M. W., Kahana M. J. A distributed representation of temporal context. Journal of Mathematical Psychology. 2002. Vol. 46. P. 269–299.
  73. Iscen A., Furon T., Gripon V., Rabbat M., Jegou H. Memory vectors for similarity search in high-dimensional spaces. arXiv:1412.3328. 1 Mar 2017.
  74. Kakeya H., Kindo T. Hierarchical concept formation in associative memory composed of neuro-window elements. Neural Networks. 1996. Vol. 9, N 7. P. 1095–1098.
  75. Kanerva P. Sparse Distributed Memory. Cambridge: MIT Press, 1988. 155 p.
  76. Kanerva P. Hyperdimensional computing: an introduction to computing in distributed representation with high-dimensional random vectors. Cognitive Computation. 2009. Vol. 1, N 2. P. 139–159.
  77. Kanter I. Potts-glass models of neural networks. Physical Rev. A. 1988. V. 37, N 7. P. 2739–2742.
  78. Karbasi A., Salavati A. H., Shokrollahi A. Iterative learning and denoising in convolutional neural associative memories. Proc. ICML’13. 2013. P. 445–453.
  79. Kartashov A., Frolov A., Goltsev A., Folk R. Quality and efficiency of retrieval for Willshaw-like autoassociative networks. Willshaw–Potts model. Network. 1997. Vol. 8, N 1. P. 71–86.
  80. Kinzel W. Learning and pattern recognition in spin glass models. Z. Physik B. 1985. Vol. 60. P. 205–213.
  81. Knoblauch A., Palm G., Sommer F. T. Memory capacities for synaptic and structural plasticity. Neural Computation. 2010. Vol. 22, N 2. P. 289–341.
  82. Kleyko D., Khan S., Osipov E., Yong S. P. Modality Classification of Medical Images with Distributed Representations based on Cellular Automata Reservoir Computing. Proc. ISBI’17. 2017. P. 1–4.
  83. Kleyko D., Lyamin N., Osipov E., Riliskis L. Dependable MAC layer architecture based on holographic data representation using hyperdimensional binary spatter codes. Proc. MACOM’12. 2012. P. 134–145
  84. Kleyko D., Osipov E. Brain-like classifier of temporal patterns. Proc. ICCOINS’14. 2014. P. 1–6.
  85. Kleyko D., Osipov E. On bidirectional transitions between localist and distributed representations: The case of common substrings search using vector symbolic architecture. Procedia Computer Science. 2014. Vol. 41. P. 104–113.
  86. Kleyko D., Osipov E., Gayler R. W. Recognizing permuted words with Vector Symbolic Architectures: A Cambridge test for machines. Procedia Computer Science. 2016. Vol. 88. P. 169–175.
  87. Kleyko D., Osipov E., Gayler R. W., Khan A. I., Dyer A. G. Imitation of honey bees’ concept learning processes using vector symbolic architectures. Biologically Inspired Cognitive Architectures. 2015. Vol. 14. P. 55–72.
  88. Kleyko D., Osipov E., Papakonstantinou N., Vyatkin V., Mousavi A. Fault detection in the hyperspace: Towards intelligent automation systems. Proc. INDIN’15. 2015. P. 1219–1224.
  89. Kleyko D., Osipov E., Senior A., Khan A. I., Sekercioglu Y. A. Holographic Graph Neuron: a bio-inspired architecture for pattern processing. IEEE Trans. Neural Networks and Learning Systems. 2017. Vol. 28, N 6. P. 1250–1262.
  90. Kleyko D., Osipov E., Rachkovskij D. Modification of holographic graph neuron using sparse distributed representations. Procedia Computer Science. 2016. Vol. 88. P. 39–45.
  91. Kleyko D., Rahimi A., Rachkovskij D.A., Osipov E., Rabaey J.M. Classification and recall with binary hyperdimensional computing: trade-offs in choice of density and mapping characteristics (2017, Submitted).
  92. Kleyko D., Rahimi A, Osipov E. Autoscaling Bloom Filter: controlling trade-off between true and false. arXiv:1705.03934. 10 May 2017
  93. Kohonen T. Content-Addressable Memories. Berlin: Springer, 1987. 388 p.
  94. Kohring G.A. Neural networks with many-neuron interactions. Journal de Physique. 1990. Vol. 51, N 2. P. 145–155.
  95. Krotov D., Hopfield J.J. Dense associative memory for pattern recognition. Proc. NIPS’16. 2016. P. 1172–1180.
  96. Krotov D., Hopfield J.J. Dense associative memory is robust to adversarial inputs. arXiv:1701.00939. 4 Jan 2017
  97. Kryzhanovsky B.V., Mikaelian A.L., Fonarev A.B. Vector neural net identifying many strongly distorted and correlated patterns. Proc. SPIE. 2005. Vol. 5642. P. 124–133.
  98. Kussul E. M. Associative neuron-like structures. Kiev: Naukova Dumka, 1992.
  99. Kussul E. M., Baidyk T. N. A modular structure of associative-projective neural networks. Preprint 93-6. Kiev, Ukraine: GIC, 1993.
  100. Kussul E.M., Fedoseyeva T.V. On audio signals recognition in neural assembly structures. Preprint 87-28. Kiev: Inst. of Cybern. 1987. 21 pp.
  101. Kussul E., Makeyev O., Baidyk T., Calderon Reyes D. Neural network with ensembles. Proc. IJCNN’10. 2010. P. 2955–2961.
  102. Kussul E.M., Rachkovskij D.A. Multilevel assembly neural architecture and processing of sequences. In Neurocomputers and Attention, V. II: Connectionism and neurocomputers. Manchester and New York: Manchester University Press, 1991. P.577–590.
  103. Kussul E.M., Rachkovskij D.A., Wunsch D.C. The random subspace coarse coding scheme for real-valued vectors. Proc. IJCNN’99. 1999. P. 450-455.
  104. Lansner A. Associative memory models: From the cell assembly theory to biophysically detailed cortex simulations. Trends in Neurosciences. 2009. Vol. 32, N 3. P. 178–186.
  105. Lansner A., Ekeberg O. Reliability and speed of recall in an associative network. IEEE Trans. on Pattern Analysis and Machine Intelligence. 1985. Vol. 7. P. 490–498.
  106. Levy S. D., Gayler R. Vector Symbolic Architectures: A new building material for artificial general intelligence. Proc. AGI’08. 2008. P. 414–418.
  107. Lowe M. On the storage capacity of Hopfield models with correlated patterns. The Annals of Applied Probability. 1998. Vol. 8, N 4. P. 1216–1250.
  108. Lowe M., Vermet F. The storage capacity of the Hopfield model and moderate deviations. Statistics and Probability Letters. 2005. Vol. 75. P. 237–248.
  109. Lowe M., Vermet F. The capacity of q-state Potts neural networks with parallel retrieval dynamics. Statistics and Probability Letters. 2007. Vol. 77, N 4. P. 1505–1514.
  110. Mazumdar A., Rawat A.S. Associative memory via a sparse recovery model. Proc. NIPS’15. 2015. P. 2683–2691.
  111. Mazumdar A., Rawat A.S. Associative memory using dictionary learning and expander decoding. Proc. AAAI’17. 2017.
  112. McEliece R.J., Posner E.C., Rodemich E.R., Venkatesh S.S. The capacity of the Hopfield associative memory. IEEE Trans. Information Theory. 1987. Vol. 33, N 4. P. 461–482.
  113. Misuno I.S., Rachkovskij D.A., Slipchenko S.V. Vector and distributed representations reflecting semantic relatedness of words. Math. machines and systems. 2005. N 3. P. 50–67.
  114. Misuno I.S., Rachkovskij D.A., Slipchenko S.V., Sokolov A.M. Searching for text information with the help of vector representations. Probl. Progr. 2005. N 4. P. 50–59.
  115. Nadal J.-P. Associative memory: on the (puzzling) sparse coding limit. J. Phys. A. 1991. Vol. 24. P. 1093–1101.
  116. Norouzi M., Punjani A., Fleet D. J. Fast exact search in Hamming space with multi-index hashing. IEEE Trans. PAMI. 2014. Vol. 36, N 6. P. 1107–1119.
  117. Onizawa N., Jarollahi H., Hanyu T., Gross W.J. Hardware implementation of associative memories based on multiple-valued sparse clustered networks. IEEE Journal on Emerging and Selected Topics in Circuits and Systems. 2016. Vol. 6, N 1. P. 13–24.
  118. Palm G. On associative memory. Biological Cybernetics. 1980. Vol. 36. P. 19–31.
  119. Palm G. Memory capacity of local rules for synaptic modification. Concepts in Neuroscience. 1991. Vol. 2, N 1. P. 97–128.
  120. Palm G. Neural associative memories and sparse coding. Neural Networks. 2013. Vol. 37. P. 165–171.
  121. Palm G., Knoblauch A., Hauser F., Schuz A. Cell assemblies in the cerebral cortex. Biol. Cybern. 2014. Vol. 108, N 5. P. 559–572
  122. Palm G., Sommer F.T. Information capacity in recurrent McCulloch–Pitts networks with sparsely coded memory states. Network. 1992. Vol. 3. P. 177–186.
  123. Parga N., Virasoro M.A. The ultrametric organization of memories in a neural network. J. Mod. Phys. 1986. Vol. 47, N 11. P. 1857–1864.
  124. Peretto P., Niez J.J. Long term memory storage capacity of multiconnected neural networks. Biol. Cybern. 1986. Vol. 54, N 1. P. 53–63.
  125. Personnaz L., Guyon I., Dreyfus G. Collective computational properties of neural networks: New learning mechanisms. Phys. Rev. A. 1986. Vol. 34, N 5. P. 4217–4228.
  126. Plate T. Holographic reduced representation: Distributed representation for cognitive structures. Stanford: CSLI Publications, 2003. 300 p.
  127. Rachkovskij D.A. Representation and processing of structures with binary sparse distributed codes. IEEE Transactions on Knowledge and Data Engineering. 2001. Vol. 13, N 2. P. 261–276.
  128. Rachkovskij D.A. Some approaches to analogical mapping with structure sensitive distributed representations. Journal of Experimental and Theoretical Artificial Intelligence. 2004. Vol. 16, N 3. P. 125–145.
  129. Rachkovskij D.A. Formation of similarity-reflecting binary vectors with random binary projections. Cybernetics and Systems Analysis. 2015. Vol. 51, N 2. P. 313–323.
  130. Rachkovskij D.A. Estimation of vectors similarity by their randomized binary projections. Cybernetics and Systems Analysis. 2015. Vol. 51, N 5. P. 808–818.
  131. Rachkovskij D.A. Binary vectors for fast distance and similarity estimation. Cybernetics and Systems Analysis. 2017. Vol. 53, N 1. P. 138–156.
  132. Rachkovskij D.A. Distance-based index structures for fast similarity search. Cybernetics and Systems Analysis. 2017. Vol. 53, N 4.
  133. Rachkovskij D.A. Index structures for fast similarity search of binary vectors. Cybernetics and Systems Analysis. 2017. Vol. 53, N 5.
  134. Rachkovskij D.A., Kussul E.M., Baidyk T.N. Building a world model with structure-sensitive sparse binary distributed representations. BICA. 2013. Vol. 3. P. 64–86.
  135. Rachkovskij D.A., Misuno I.S., Slipchenko S.V. Randomized projective methods for construction of binary sparse vector representations. Cybernetics and Systems Analysis. 2012. Vol. 48, N 1. P. 146–156.
  136. Rachkovskij D.A., Slipchenko S.V. Similarity-based retrieval with structure-sensitive sparse binary distributed representations. Computational Intelligence. 2012. Vol. 28, N 1. P. 106–129.
  137. Rachkovskij D.A., Slipchenko S.V., Kussul E.M., Baidyk T.N. Sparse binary distributed encoding of scalars. J. of Automation and Inf. Sci. 2005. Vol. 37, N 6. P. 12–23.
  138. Rachkovskij D.A., Slipchenko S.V., Kussul E.M., Baidyk T.N. Properties of numeric codes for the scheme of random subspaces RSC. Cybernetics and Systems Analysis. 2005. Vol. 41, N 4. P. 509–520.
  139. Rachkovskij D.A., Slipchenko S.V., Misuno I.S., Kussul E.M., Baidyk T.N. Sparse binary distributed encoding of numeric vectors. J. of Automation and Inf. Sci. 2005. Vol. 37, N 11. P. 47–61.
  140. Reznik A.M., Sitchov A.S., Dekhtyarenko O.K., Nowicki D.W. Associative memories with killed neurons: the methods of recovery. Proc. IJCNN’03. 2003. P. 2579–2582.
  141. Rizzuto D.S., Kahana M.J. An autoassociative neural network model of paired-associate learning. Neural Computation. 2001. Vol. 13. P. 2075–2092.
  142. Rolls E. T. Advantages of dilution in the connectivity of attractor networks in the brain. Biologically Inspired Cognitive Architectures. 2012. Vol. 1, P. 44–54.
  143. Romani S., Pinkoviezky I., Rubin A., Tsodyks M. Scaling laws of associative memory retrieval. Neural Computation. 2013. Vol. 25, N 10. P. 2523–2544.
  144. Rosenfeld R., Touretzky D.S. Coarse-coded symbol memories and their properties. Complex Systems. 1988. Vol. 2, N 4. P. 463–484.
  145. Salavati A.H., Kumar K.R., Shokrollahi A. Nonbinary associative memory with exponential pattern retrieval capacity and iterative learning. IEEE Trans. Neural Networks and Learning Systems. 2014. Vol. 25, N 3. P. 557–570.
  146. Schwenker F., Sommer F.T., Palm G. Iterative retrieval of sparsely coded associative memory patterns. Neural Networks. 1996. Vol. 9. P. 445-455.
  147. Shrivastava A., Li P. In defense of minhash over simhash. Proc. AISTATS’14. 2014. P. 886–894.
  148. Slipchenko S.V., Rachkovskij D.A. Analogical mapping using similarity of binary distributed representations. International Journal Information Theories and Applications. 2009. Vol. 16, N 3. P. 269–290.
  149. Tarkoma S., Rothenberg C.E., Lagerspetz E. Theory and Practice of Bloom Filters for Distributed Systems. IEEE Communications Surveys and Tutorials. 2012. Vol. 14, N 1. P. 131–155.
  150. Tsodyks M. Associative memory in asymmetric diluted network with low level of activity. Europhysics Letters. 1988. Vol. 7, N 3. P. 203–208.
  151. Tsodyks M. Associative memory in neural networks with the hebbian learning rule. Modern Physics Letters B. 1989. Vol. 3, N 7. P. 555–560.
  152. Tsodyks M.V. Associative memory in neural networks with binary synapses. Mod. Phys. Lett. 1990. Vol. B4. P. 713–716.
  153. Tsodyks M.V. Hierarchical associative memory in neural networks with low activity level. Modern Physics Letters B. 1990. Vol. 4, N 4. P. 259–265.
  154. Tsodyks M., Feigelman M. The enhanced storage capacity in neural networks with low activity level. Europhysics Letters. 1988. Vol. 6, N 2. P. 101–105.
  155. Vedenov A. A., Ezhov A.A., Knizhnikova L.A., Levchenko E.B. “Spurious memory” in model neural networks. Preprint IAE-4395/1. 1987. Moscow: KIAE.
  156. Willshaw D. Holography, associative memory and inductive generalization. In Parallel Models of Associative Memory. Hillside: Lawrence Erlbaum Associates. 1981. P. 83–104.
  157. Willshaw D.J., Buneman O.P., Longuet-Higgins H.C. Non-holographic associative memory. Nature. 1969. Vol. 222. P. 960–962.
  158. Yang X., Vernitski A., Carrea L. An approximate dynamic programming approach for improving accuracy of lossy data compression by Bloom filters. European Journal of Operational Research. 2016. Vol. 252, N 3. P. 985–994.
  159. Yu C., Gripon V., Jiang X., Jegou H. Neural associative memories as accelerators for binary vector search. Proc. COGNITIVE’15. 2015. P. 85–89.

Received 15.04.2017

Issue 1 (187), article 1


Kibern. vyčisl. teh., 2017, Issue 1 (187), pp.5-11

Grytsenko V.I., Corresponding Member of NAS of Ukraine, Director of International
Research and Training Center for Information Technologies and Systems of National
Academy of Sciences of Ukraine and Ministry of Education and Science of Ukraine



On May 5, 1997, the International Research and Training Center for Information Technologies and Systems of the NAS and MES of Ukraine was established by the National Academy of Sciences of Ukraine.

Over these 20 years, a new scientific direction — Intelligent Information Technologies (IIT) — has been formed. Its methodology, software and hardware became the basis for the development of IIT of imaginative thinking, neural network technologies, IIT for digital medicine, e-education and intelligent control technologies.

The basic directions of fundamental and applied scientific research in the International Center are: creation of intelligent information technologies based on methods and means of imaginative thinking, comprehensive research of problems of intelligent management, intelligent robotics, digital medicine, e-learning, digital information space and technologies for the development of a secure information society.

Along the main directions of the International Center’s work, scientific schools in the fields of information technologies and systems, technical cybernetics, biological and medical cybernetics, and mathematical analysis of complex economic systems have been formed. An important contribution to the development of these scientific schools was made by outstanding Ukrainian scientists — academicians V.I. Skurikhin, A.G. Ivakhnenko, N.M. Amosov and A.A. Bakaev. Their students and followers successfully develop these scientific directions in our country and abroad.

The International Center initiated the research and development of the concept of a new class of information technologies — intelligent information technologies. These are special, knowledge-intensive information technologies that differ from conventional IT in a new quality: they operate with images of information objects. Through the contours of intelligent IT, understanding of human speech, recognition of real and artificially created objects, active interaction with the environment, revealing the essence of phenomena, operating with knowledge, and choosing the strategy and tactics for achieving a set goal are achieved.

The Technical Committee for Standardization of Information Technologies, the scientific journals “Control Systems and Computers” and “Cybernetics and Computer Engineering”, and presentations by our scientists at prestigious international conferences, symposia and exhibitions make an important contribution to increasing the authority of the International Center.

The International Center has formed a program of work for the coming years and defined the mechanisms for its implementation in the context of the rapid intellectualization of information technologies in all spheres of society. A comprehensive analysis has shown that this program fully corresponds to the global trends characterized by the term “digital transformation” and covers research priorities in information technology for a period of 5–10 years.

Keywords: intelligent information technology, imaginative thinking, intelligent management, digital medicine, e-learning, robotics, information society.
