The Revival of Hopfield Networks: From Classic Models to Modern AI

A concise tour of contemporary Hopfield-network research with quick notes on influential papers. Updated: August 21, 2025.

Why Hopfield networks again?

Hopfield networks began as elegant models of associative memory, then re-emerged as powerful building blocks for modern machine learning. The works below chart that journey—from classical capacity analyses to architectures that behave like attention and scale to large memories.

Hopfield Networks in the Modern Era

Hopfield Networks is All You Need — Ramsauer et al., 2020

This paper reframes Hopfield networks with continuous states and an energy function that enables exponential storage capacity and fast retrieval. The resulting “modern Hopfield network” acts like a content-addressable attention mechanism, integrating cleanly into deep learning pipelines.
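The update rule at the heart of the paper has exactly the shape of softmax attention over the stored patterns. A minimal NumPy sketch (our own illustration, not code from the paper; the function names and the choice of `beta` are ours):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # shift for numerical stability
    return e / e.sum()

def retrieve(X, xi, beta=8.0, steps=1):
    """Modern Hopfield update xi <- X @ softmax(beta * X.T @ xi):
    a content-based lookup that is the single-query form of softmax attention."""
    for _ in range(steps):
        xi = X @ softmax(beta * (X.T @ xi))
    return xi

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 3))                 # three stored patterns as columns
cue = X[:, 0] + 0.1 * rng.standard_normal(64)    # noisy version of pattern 0
out = retrieve(X, cue)                           # converges toward pattern 0
```

With well-separated patterns and a moderately large `beta`, a single update typically lands very close to the nearest stored pattern, which is the "one-step retrieval" property the paper emphasizes.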

Associative Memory as Error Correction

Bipartite Expander Hopfield Networks as Self-Decoding High-Capacity Error Correcting Codes — Chaudhuri & Fiete, 2019

Chaudhuri and Fiete build Hopfield networks on bipartite expander graphs, showing they can function as robust, high-capacity error-correcting codes. The expander structure enables reliable self-decoding of stored patterns, tightly linking associative memory to coding-theoretic guarantees.

Expanding (and Understanding) Capacity

Storage Capacity of the Hopfield Network Associative Memory — Wu, Hu, Wu, Zhou & Du, 2012

This study revisits a fundamental question: how many patterns can a classical Hopfield network reliably store and recall? The authors analyze theoretical limits alongside simulations, highlighting how factors such as pattern correlation and sparsity affect practical capacity.
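The classical setting these capacity analyses study can be reproduced in a few lines: Hebbian outer-product learning stores roughly 0.14N random patterns in an N-neuron network, and well below that load, retrieval from a corrupted cue succeeds. A small simulation sketch (our own, with an arbitrary seed and load):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5                      # load P/N = 0.025, safely under the ~0.14N classical limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product learning, with self-connections zeroed.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=20):
    """Synchronous sign updates until a fixed point (or the step limit)."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

noisy = patterns[0].copy()
noisy[rng.choice(N, size=20, replace=False)] *= -1   # flip 10% of the bits
recovered = recall(noisy)                            # should converge to pattern 0
```

Pushing `P` toward and past the capacity limit in this sketch makes retrieval degrade and then fail, which is exactly the regime the paper's analysis characterizes.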

Dense Associative Memory

Dense Associative Memory for Pattern Recognition — Krotov & Hopfield, 2016

Krotov and Hopfield generalize beyond quadratic energies with higher-order interactions, yielding dense associative memory (DAM). DAMs improve robustness and recognition performance, offering a path beyond classical Hopfield limits toward richer, more selective attractor dynamics.
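The DAM idea can be sketched concretely: replace the quadratic energy with -Σ_μ F(⟨ξ^μ, σ⟩) for a rapidly growing F, and flip each bit to whichever sign lowers that energy. A minimal illustration (our own; the rectified polynomial F(x) = max(x, 0)^n is one of the choices considered in the paper, and the network size, load, and seed here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 50, 30                      # load P/N = 0.6, far above the ~0.14N classical limit
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)

def dam_recall(patterns, state, n=3, sweeps=5):
    """Asynchronous DAM update: set each bit to the sign that maximizes
    sum_mu F(<xi_mu, sigma>), with rectified polynomial F(x) = max(x, 0)**n."""
    state = state.astype(float).copy()
    F = lambda x: np.maximum(x, 0.0) ** n
    for _ in range(sweeps):
        for i in range(len(state)):
            rest = patterns @ state - patterns[:, i] * state[i]  # overlaps excluding bit i
            plus = F(rest + patterns[:, i]).sum()                # energy term if bit i = +1
            minus = F(rest - patterns[:, i]).sum()               # energy term if bit i = -1
            state[i] = 1.0 if plus >= minus else -1.0
    return state

noisy = patterns[0].copy()
noisy[rng.choice(N, size=5, replace=False)] *= -1    # flip 10% of the bits
recovered_dam = dam_recall(patterns, noisy)
```

The same setup with a quadratic energy (n = 2 without rectification reduces to the classical rule) would fail at this load; the higher-order interaction sharpens each attractor enough to store many more patterns.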

Hierarchical Memory Structures

Hierarchical Associative Memory — Krotov, 2021

This work organizes memories across multiple levels, enabling coarse-to-fine abstraction and compositional retrieval. Hierarchical structure brings associative memory closer to scalable, biologically inspired models of concept formation and generalization.

Memory at Scale in Biology and AI

The Large Associative Memory Problem in Neurobiology and Machine Learning — Krotov & Hopfield, 2020

Addressing how to store and retrieve vast numbers of patterns, the authors synthesize perspectives from cortex and machine learning. They argue that modern associative mechanisms can bridge biological plausibility with the engineering demands of large-scale memory.

Modern Hopfield Networks in Vision–Language

CLOOB: Modern Hopfield Networks with InfoLOOB Outperform CLIP — Fürst et al., 2022

Fürst and colleagues introduce CLOOB, which combines modern Hopfield networks with the InfoLOOB objective to train vision–language models. CLOOB surpasses CLIP on several benchmarks, illustrating how Hopfield-style associative memory can rival transformer-based multimodal learning.

Differentiable Clustering with Associative Memory

End-to-End Differentiable Clustering with Associative Memory

A framework that integrates associative memory mechanisms with clustering objectives in a fully differentiable, end-to-end pipeline. The approach aims to enable unsupervised pattern grouping while leveraging memory-based retrieval and representation learning.

*(Authors, venue, year, and link TBD.)*


References

@article{ramsauer2020hopfield,
  title={{Hopfield} networks is all you need},
  author={Ramsauer, Hubert and Sch{\"a}fl, Bernhard and Lehner, Johannes and Seidl, Philipp and Widrich, Michael and Adler, Thomas and Gruber, Lukas and Holzleitner, Markus and Pavlovi{\'c}, Milena and Sandve, Geir Kjetil and others},
  journal={arXiv preprint arXiv:2008.02217},
  year={2020}
}

@article{chaudhuri2019bipartite,
  title={Bipartite expander {Hopfield} networks as self-decoding high-capacity error correcting codes},
  author={Chaudhuri, Rishidev and Fiete, Ila},
  journal={Advances in Neural Information Processing Systems},
  volume={32},
  year={2019}
}

@article{krotov2016dense,
  title={Dense associative memory for pattern recognition},
  author={Krotov, Dmitry and Hopfield, John J},
  journal={Advances in Neural Information Processing Systems},
  volume={29},
  year={2016}
}

@inproceedings{wu2012storage,
  title={Storage capacity of the {Hopfield} network associative memory},
  author={Wu, Yue and Hu, Jianqing and Wu, Wei and Zhou, Yong and Du, KL},
  booktitle={2012 Fifth International Conference on Intelligent Computation Technology and Automation},
  pages={330--336},
  year={2012},
  organization={IEEE}
}

@article{krotov2021hierarchical,
  title={Hierarchical associative memory},
  author={Krotov, Dmitry},
  journal={arXiv preprint arXiv:2107.06446},
  year={2021}
}

@article{krotov2020large,
  title={Large associative memory problem in neurobiology and machine learning},
  author={Krotov, Dmitry and Hopfield, John},
  journal={arXiv preprint arXiv:2008.06996},
  year={2020}
}

@article{furst2022cloob,
  title={{CLOOB}: Modern {Hopfield} networks with {InfoLOOB} outperform {CLIP}},
  author={F{\"u}rst, Andreas and Rumetshofer, Elisabeth and Lehner, Johannes and Tran, Viet T and Tang, Fei and Ramsauer, Hubert and Kreil, David and Kopp, Michael and Klambauer, G{\"u}nter and Bitto, Angela and others},
  journal={Advances in Neural Information Processing Systems},
  volume={35},
  pages={20450--20468},
  year={2022}
}

@article{differentiable-clustering,
  title={End to end differentiable clustering with associative memory},
  author={TBD},
  journal={TBD},
  year={TBD}
}