Pseudo-Orthogonalization of Memory Patterns for Complex-Valued and Quaternionic Associative Memories


Abstract

The Hebbian learning rule is a well-known scheme for storing memory patterns in associative memory models. It is simple and fast, but its performance degrades when the memory patterns are not orthogonal to each other. Pseudo-orthogonalization is a decorrelation method that applies XNOR masking between the memory patterns and randomly generated patterns. Combined with the Hebbian learning rule, it improves the storage capacity of associative memories for non-orthogonal patterns without high computational cost. The stored patterns can then be retrieved by a simulated-annealing procedure driven by an external stimulus pattern. By utilizing complex numbers and quaternions, pseudo-orthogonalization can be extended to complex-valued and quaternionic Hopfield neural networks. In this paper, the extended pseudo-orthogonalization methods for associative memories based on complex numbers and quaternions are examined from the viewpoint of correlations among memory patterns. We show that the method achieves more stable recall on highly correlated memory patterns than the conventional real-valued method.
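The following is a minimal sketch, in Python/NumPy, of the real-valued case described above: XNOR masking of bipolar (±1) patterns with random patterns, Hebbian storage, and annealed recall under an external stimulus. The function names (pseudo_orthogonalize, hebbian_weights, annealed_recall), the flip probability flip_prob, and the annealing schedule are illustrative assumptions rather than the exact formulation of the paper; the complex-valued and quaternionic extensions would replace the ±1 elements and the XNOR operation with their multistate counterparts.

```python
import numpy as np


def pseudo_orthogonalize(patterns, flip_prob, rng):
    """XNOR-mask each bipolar (+/-1) pattern with an independent random pattern.

    In the +/-1 representation, XNOR of two bits equals their product, so the
    masking reduces to flipping each element with probability `flip_prob`,
    which weakens the correlations between the stored patterns.
    """
    masks = np.where(rng.random(patterns.shape) < flip_prob, -1, 1)
    return patterns * masks, masks


def hebbian_weights(patterns):
    """Hebbian (outer-product) learning rule with zero self-connections."""
    num_neurons = patterns.shape[1]
    w = patterns.T @ patterns / num_neurons
    np.fill_diagonal(w, 0.0)
    return w


def annealed_recall(w, stimulus, beta_schedule, rng):
    """Stochastic recall driven by an external stimulus pattern.

    The stimulus is added to the local field as an external input while the
    inverse temperature beta is gradually increased (simulated annealing).
    """
    state = stimulus.copy()
    for beta in beta_schedule:
        field = w @ state + stimulus
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        state = np.where(rng.random(state.size) < p_plus, 1, -1)
    return state


# Illustrative usage with random bipolar patterns (hypothetical sizes).
rng = np.random.default_rng(0)
xi = np.where(rng.random((20, 256)) < 0.5, -1, 1)
eta, masks = pseudo_orthogonalize(xi, flip_prob=0.3, rng=rng)
w = hebbian_weights(eta)
recalled = annealed_recall(w, eta[0], np.linspace(0.5, 4.0, 50), rng)
print(np.mean(recalled == eta[0]))  # overlap with the stored (masked) pattern
```

The key simplification exploited here is that, in ±1 coding, XNOR of two elements is just their elementwise product, so masking and unmasking are both single multiplications.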

