Issue 2 (188), article 1

DOI: https://doi.org/10.15407/kvt188.02.005

Kibern. vyčisl. teh., 2017, Issue 2 (188), pp.

Grytsenko V.I., Corresponding Member of NAS of Ukraine, Director
e-mail: vig@irtc.org.ua
International Research and Training Center for Information Technologies and Systems of the NAS of Ukraine and of the Ministry of Education and Science of Ukraine,
av. Acad. Glushkova, 40, Kiev, 03680, Ukraine

Rachkovskij D.A., Doctor of Engineering, Leading Researcher,
Dept. of Neural Information Processing Technologies,
e-mail: dar@infrm.kiev.ua
International Research and Training Center for Information Technologies and Systems of the NAS of Ukraine and of the Ministry of Education and Science of Ukraine,
av. Acad. Glushkova, 40, Kiev, 03680, Ukraine

Frolov A.A., Doctor of Biology, Professor,
Faculty of Electrical Engineering and Computer Science (FEI),
e-mail: docfact@gmail.com
Technical University of Ostrava, 17. listopadu 15, 708 33 Ostrava-Poruba, Czech Republic

Gayler R., PhD,
Independent Researcher,
e-mail: r.gayler@gmail.com
Melbourne, VIC, Australia

Kleyko D., PhD Student,
Department of Computer Science, Electrical and Space Engineering,
e-mail: denis.kleyko@ltu.se
Lulea University of Technology, 971 87 Lulea, Sweden

Osipov E., PhD, Professor,
Department of Computer Science, Electrical and Space Engineering,
e-mail: evgeny.osipov@ltu.se
Lulea University of Technology, 971 87 Lulea, Sweden

NEURAL DISTRIBUTED AUTOASSOCIATIVE MEMORIES: A SURVEY.

Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors), where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of sublinear-time search (in the number of stored items) for approximate nearest neighbors among high-dimensional vectors.
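
To make this concrete, the following is a minimal illustrative sketch (in Python; it is not taken from the surveyed papers, and all parameter values are chosen by us) of such a memory: a Hopfield-type network that stores bipolar vectors with a local Hebbian outer-product rule and retrieves them by iterative thresholded dynamics.

import numpy as np

rng = np.random.default_rng(0)
N = 256   # number of neurons = vector dimension
M = 20    # number of stored items (illustrative loading, M/N well below capacity)

patterns = rng.choice([-1, 1], size=(M, N))   # bipolar stored vectors

# Local Hebbian learning: outer-product rule, no self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def retrieve(x, steps=20):
    """Iterative retrieval: each neuron updates from its local weighted input."""
    for _ in range(steps):
        y = np.sign(W @ x)
        y[y == 0] = 1
        if np.array_equal(y, x):   # fixed point (attractor) reached
            return y
        x = y
    return x

# Cue: a stored pattern with 10% of its components flipped
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
out = retrieve(cue)
print("overlap with the stored item:", out @ patterns[0] / N)   # close to 1.0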

The purpose of the paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons).

Scope. The survey focuses mainly on the Hopfield, Willshaw, and Potts networks, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections, as well as networks with a bipartite graph structure for non-binary data with linear constraints.
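
For the sparse binary case, a minimal sketch of a Willshaw-type memory (again illustrative only, with parameters assumed by us): weights are binary, learning is clipped-Hebbian, and one-step retrieval fires the units whose summed input reaches the cue's activity level.

import numpy as np

rng = np.random.default_rng(1)
N, K, M = 1024, 16, 100   # neurons, active units per pattern, stored patterns

def sparse_pattern():
    x = np.zeros(N, dtype=np.uint8)
    x[rng.choice(N, size=K, replace=False)] = 1
    return x

patterns = np.array([sparse_pattern() for _ in range(M)])

# Clipped Hebbian learning: W[i, j] = 1 if units i and j were ever co-active
W = np.zeros((N, N), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)

def retrieve(cue):
    """One-step retrieval: fire units whose input reaches the cue's activity."""
    s = W @ cue                 # dendritic sums (at most K, so uint8 is safe here)
    return (s >= cue.sum()).astype(np.uint8)

# Cue with half of the active units of a stored pattern deleted
cue = patterns[0].copy()
active = np.flatnonzero(cue)
cue[active[K // 2:]] = 0
out = retrieve(cue)
# At this low loading, spurious activations are very unlikely
print("recalled exactly:", np.array_equal(out, patterns[0]))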

Conclusions. In conclusion, we discuss relations to similarity search, the advantages and drawbacks of these techniques, and topics for further research. An interesting and not yet completely resolved question is whether neural autoassociative memories can find approximate nearest neighbors faster than other index structures for similarity search, in particular for very high-dimensional vectors.
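
For context, the baseline any such sublinear method must beat is a brute-force scan, linear in the number of stored items; a short illustrative sketch (our own, not from the paper):

import numpy as np

rng = np.random.default_rng(2)
M, N = 10_000, 1024
database = rng.integers(0, 2, size=(M, N), dtype=np.uint8)

def nearest_hamming(query):
    """O(M*N) linear scan over all stored vectors."""
    dists = np.count_nonzero(database != query, axis=1)
    return int(np.argmin(dists)), int(dists.min())

query = database[42] ^ (rng.random(N) < 0.05)   # 5% of bits flipped
idx, dist = nearest_hamming(query)
print(idx, dist)   # expect item 42 at a small Hamming distance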

Keywords: distributed associative memory, sparse binary vector, Hopfield network, Willshaw memory, Potts model, nearest neighbor, similarity search


Received 15.04.2017