Search Results

You are looking at 1 - 2 of 2 items for

  • Author: Nobuyuki Matsui
Open access

Toshifumi Minemoto, Teijiro Isokawa, Haruhiko Nishimura and Nobuyuki Matsui

Abstract

The Hebbian learning rule is well known as a memory storage scheme for associative memory models. This scheme is simple and fast; however, its performance degrades when the memory patterns are not orthogonal to each other. Pseudo-orthogonalization is a decorrelation method for memory patterns that applies XNOR masking between the memory patterns and randomly generated patterns. By combining this method with the Hebbian learning rule, the storage capacity of an associative memory for non-orthogonal patterns is improved without high computational cost. The memory patterns can also be retrieved by a simulated annealing method using an external stimulus pattern. By utilizing complex numbers and quaternions, pseudo-orthogonalization can be extended to complex-valued and quaternionic Hopfield neural networks. In this paper, the extended pseudo-orthogonalization methods for associative memories based on complex numbers and quaternions are examined from the viewpoint of correlations in the memory patterns. We show that the method achieves stable recall performance on highly correlated memory patterns compared to the conventional real-valued method.
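
The abstract gives no equations or code, so the following is a minimal Python sketch of the real-valued baseline it describes: Hebbian storage combined with XNOR-based pseudo-orthogonalization. It assumes bipolar {-1, +1} patterns, where XNOR reduces to an element-wise product, and it uses a plain deterministic update loop rather than the simulated-annealing retrieval mentioned in the abstract. All sizes, variable names, and the biased pattern generator are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # number of neurons (illustrative)
P = 10           # number of memory patterns (illustrative)

# Correlated bipolar memory patterns in {-1, +1} (biased toward +1 so they overlap).
patterns = np.sign(rng.standard_normal((P, N)) + 0.5)
patterns[patterns == 0] = 1

# Pseudo-orthogonalization: XNOR-mask each pattern with a random pattern.
# For bipolar codes, XNOR(a, b) is simply the element-wise product a * b.
masks = rng.choice([-1, 1], size=(P, N))
decorrelated = patterns * masks

# Hebbian learning on the decorrelated patterns (zero self-connections).
W = decorrelated.T @ decorrelated / N
np.fill_diagonal(W, 0.0)

# Recall from a noisy cue of the first decorrelated pattern
# (deterministic synchronous updates here, not simulated annealing).
cue = decorrelated[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
state = cue
for _ in range(20):
    state = np.sign(W @ state)
    state[state == 0] = 1

# The original pattern is recovered by undoing the mask (XNOR once more).
recovered = state * masks[0]
print("overlap with stored pattern:", np.mean(recovered == patterns[0]))
```

Because the random masks are (nearly) uncorrelated, the masked patterns behave as if they were orthogonal for the Hebbian rule, while the original correlated patterns remain recoverable by reapplying the same mask after retrieval.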

Open access

Teijiro Isokawa, Hiroki Yamamoto, Haruhiko Nishimura, Takayuki Yumoto, Naotake Kamiura and Nobuyuki Matsui

Abstract

In this paper, we investigate the stability of patterns embedded as associative memories distributed over a complex-valued Hopfield neural network, in which the neuron states are encoded by phase values on the unit circle of the complex plane. As learning schemes for embedding patterns onto the network, the projection rule and the iterative learning rule are formally extended to the complex-valued case. The retrieval of patterns embedded by the iterative learning rule is demonstrated, and the stability of the embedded patterns is quantitatively investigated.
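
As a companion illustration, here is a minimal Python sketch of a phase-encoded complex-valued Hopfield network trained with a projection (pseudo-inverse) rule, one of the two learning schemes named in the abstract. The quantization level K, the network size, the synchronous update, and the quantize helper are assumptions chosen for a compact example; the paper's iterative learning rule is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 64      # number of neurons (illustrative)
P = 5       # number of stored patterns (illustrative)
K = 8       # number of phase quantization levels (assumed)

# Phase-encoded memory patterns: each neuron state is exp(2*pi*i*k/K) on the unit circle.
levels = rng.integers(0, K, size=(P, N))
X = np.exp(2j * np.pi * levels / K).T        # N x P matrix, one pattern per column

# Projection (pseudo-inverse) rule in the complex case:
# W = X (X^H X)^{-1} X^H, which makes every stored pattern a fixed point of W.
W = X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T

def quantize(z, K):
    """Snap complex values to the nearest of K phase levels on the unit circle."""
    k = np.round(np.angle(z) / (2 * np.pi / K)).astype(int) % K
    return np.exp(2j * np.pi * k / K)

# Retrieval from a phase-perturbed version of the first stored pattern.
state = X[:, 0] * np.exp(2j * np.pi * 0.05 * rng.standard_normal(N))
for _ in range(10):
    state = quantize(W @ state, K)

print("fraction of components recovered:",
      np.mean(np.isclose(state, X[:, 0])))
```

The projection rule guarantees that each stored pattern is mapped to itself by W; stability under perturbation, which is what the paper examines quantitatively, depends on how the quantized dynamics behave in the neighborhood of those fixed points.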