Neural associative memory (AM) is one of the critical building blocks for cognitive computing systems. It memorizes (learns) and retrieves input data by the information content itself. One of the key challenges in designing AM for intelligent devices is to expand memory capacity while using a minimal amount of hardware and energy resources. However, prior art shows that memory capacity grows slowly, i.e., as the square root of the total number of synaptic weights. To tackle this problem, we propose a synapse model called recursive synaptic bit reuse, which enables near-linear scaling of memory capacity with the total number of synaptic bits. Our model can also handle correlated input data more robustly than the conventional model. We evaluated our model in the context of Hopfield neural networks (HNNs) containing 5–327 KB of data storage for synaptic weights. Our model can increase the memory capacity of HNNs by as much as 30× over conventional ones. A very-large-scale integration implementation of HNNs in 65 nm confirms that our proposed model can save up to 19× area and up to 232× energy dissipation compared to the conventional model. These savings are expected to grow with the network size.
Memory capacity can also be increased by constructing multiple Hopfield neural networks.
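To make the baseline concrete, the following is a minimal sketch of a conventional Hopfield network with Hebbian (outer-product) learning and sign-based retrieval. This illustrates only the standard model whose square-root capacity scaling the abstract refers to; it does not implement the proposed recursive synaptic bit reuse, and all function names are illustrative.

```python
import numpy as np

def train(patterns):
    # Hebbian outer-product rule over bipolar (+1/-1) patterns;
    # zero the diagonal so neurons have no self-connections.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, steps=10):
    # Synchronous sign updates until a fixed point or the step limit.
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one pattern, corrupt one bit, and recover it.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]
recovered = recall(W, noisy)
print(np.array_equal(recovered, pattern))  # → True
```

In this conventional scheme, each of the n² weights is stored separately, which is why capacity grows only with the square root of the synaptic storage; the paper's bit-reuse model changes how those weight bits are shared, not the retrieval dynamics shown here.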