  • Poster presentation
  • Open access

Novel local learning rule for neural adaptation fits Hopfield memory networks efficiently and optimally

We present an algorithm to store binary memories in a Little-Hopfield neural network using minimum probability flow, a recent technique to fit parameters in energy-based probabilistic models. For memories without noise, our algorithm provably achieves optimal pattern storage and outperforms classical methods both in speed and memory recovery. Moreover, when trained on noisy or corrupted versions of a fixed set of binary patterns, our algorithm finds networks which correctly store the originals. We also demonstrate this finding visually with the unsupervised storage and clean-up of large binary fingerprint images from significantly corrupted samples.

Background

In 1982, motivated by the neural modeling work of [3] and the Ising spin glass model from statistical physics [2], Hopfield introduced a method for the storage and retrieval of binary patterns in an auto-associative neural network [1]. However, existing techniques for training Little-Hopfield networks suffer either from limited pattern capacity or excessive training time, and they exhibit poor performance when trained on unlabeled, corrupted memories.
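For concreteness (a standard formulation consistent with [1], not spelled out in this abstract): a Little-Hopfield network on n binary neurons is specified by a symmetric weight matrix W with zero diagonal and a threshold vector \theta, and assigns each state x \in \{0,1\}^n the energy

E_W(x) = -\frac{1}{2} x^T W x + \theta^T x.

Recall is by asynchronous updates, setting x_i = 1 if \sum_j W_{ij} x_j > \theta_i and x_i = 0 otherwise; each update can only lower the energy, so stored memories correspond to fixed points at local energy minima.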

Results

Our main theoretical contributions here are the introduction of a tractable, neurally plausible algorithm (MPF) for the optimal storage of patterns in a Little-Hopfield network, a proof that the capacity of such a network is at least one pattern per neuron, and a novel local learning rule for training neural networks. Our approach is inspired by minimum probability flow [4], which fits parameters in probabilistic models without computing the partition function.
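To sketch how minimum probability flow specializes to this setting (our reading of [4] combined with the energy above; the abstract itself does not display the objective): given a set D of patterns to store, and writing N(x) for the n states obtained from x by flipping a single bit, MPF minimizes

K_D(W, \theta) = \sum_{x \in D} \sum_{x' \in N(x)} \exp\left( \frac{E_W(x) - E_W(x')}{2} \right).

No partition function appears; K_D is convex in (W, \theta), since each energy is linear in the parameters; and the gradient with respect to W_{ij} involves only quantities available at neurons i and j, which is what makes the learning rule local. Driving K_D toward zero forces each stored pattern's energy below that of all its one-bit-flip neighbors, making the pattern a fixed point of the recall dynamics.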

We also present several experimental results. Compared with standard techniques for Little-Hopfield pattern storage, our method is shown to be superior in both efficiency and generalization (Figure 1). Another finding is that our algorithm can store many patterns in a Little-Hopfield network from highly corrupted (unlabeled) samples of them (Figure 2). Finally, we store 64 × 64 binary images of human fingerprints from highly corrupted versions (figure not shown).
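The following is a minimal sketch of this storage-and-recall procedure under the formulation above (illustrative only: the plain gradient-descent loop, learning rate, and function names are our assumptions, not the authors' implementation):

    import numpy as np

    def mpf_grad(W, theta, X):
        """MPF objective K_D and gradients for patterns X (shape m x n, entries 0/1).

        With E_W(x) = -0.5 x^T W x + theta^T x and single-bit-flip neighbors,
        E(x) - E(x with bit i flipped) = (2 x_i - 1) * (theta_i - (W x)_i).
        """
        S = (2 * X - 1) * (theta - X @ W)   # (m, n) matrix of energy gaps
        Z = np.exp(0.5 * S)                 # per-pattern, per-flip objective terms
        G = 0.5 * Z * (2 * X - 1)           # shared factor in both gradients
        dW = -(G.T @ X)
        dW = dW + dW.T                      # respect the symmetric parameterization
        np.fill_diagonal(dW, 0.0)           # keep the diagonal fixed at zero
        return Z.sum(), dW, G.sum(axis=0)

    def store(X, lr=0.02, steps=2000):
        """Fit (W, theta) by gradient descent on the convex objective K_D."""
        n = X.shape[1]
        W, theta = np.zeros((n, n)), np.zeros(n)
        for _ in range(steps):
            _, dW, dtheta = mpf_grad(W, theta, X)
            W -= lr * dW
            theta -= lr * dtheta
        return W, theta

    def recall(W, theta, x, sweeps=100):
        """Asynchronous threshold dynamics: descend the energy from a noisy cue."""
        x = x.copy()
        for _ in range(sweeps):
            for i in np.random.permutation(len(x)):
                x[i] = 1 if W[i] @ x > theta[i] else 0
        return x

Calling store on a 0/1 pattern matrix and then recall on a corrupted copy of a stored row mirrors the fingerprint clean-up experiment described above.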

Figure 1. Performance. Comparison of our training algorithm (MPF) to the classical methods Outer Product Rule (OPR) and Perceptron (PER).

Figure 2. Noisy data storage. Fraction of patterns (red for MPF, blue for PER) and fraction of bits (dotted red for MPF, dotted blue for PER) recalled by trained networks (n = 64 nodes each), as a function of the number m of patterns to be stored. Training patterns were presented repeatedly with 20-bit corruption (i.e., 31% of the bits flipped).

References

  1. Hopfield JJ: Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci U S A. 1982, 79: 2554-2558. 10.1073/pnas.79.8.2554.

  2. Ising E: Beitrag zur Theorie des Ferromagnetismus. Zeitschrift für Physik. 1925, 31: 253-258.

  3. Little WA: The existence of persistent states in the brain. Math Biosciences. 1974, 19: 101-120.

  4. Sohl-Dickstein J, Battaglino P, DeWeese MR: Minimum probability flow learning. Phys Rev Lett. 2011, 107: 220601.


Author information

Correspondence to Chris Hillar.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Cite this article

Hillar, C., Sohl-Dickstein, J. & Koepsell, K. Novel local learning rule for neural adaptation fits Hopfield memory networks efficiently and optimally. BMC Neurosci 14 (Suppl 1), P215 (2013). https://doi.org/10.1186/1471-2202-14-S1-P215
