Capacity of the Hebbian-Hopfield network associative memory
March 5, 2024, 2:44 p.m. | Mihailo Stojnic
cs.LG updates on arXiv.org
Abstract: In \cite{Hop82}, Hopfield introduced a neural network model based on a \emph{Hebbian} learning rule and suggested how it can operate efficiently as an associative memory. Studying random binary patterns, he also uncovered that, if a small fraction of errors is tolerated in the retrieval of stored patterns, the capacity of the network (the maximal number of memorized patterns, $m$) scales linearly with each pattern's size, $n$. Moreover, he famously predicted $\alpha_c=\lim_{n\rightarrow\infty}\frac{m}{n}\approx 0.14$. We study this very same scenario with …
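The classical setting the abstract describes can be sketched in a few lines: store random $\pm 1$ patterns with the Hebbian outer-product rule, then retrieve a corrupted pattern by iterating sign updates. This is a minimal illustration of the standard Hopfield construction, not the paper's analysis; the sizes ($n=200$, $m=10$, i.e. $\alpha=0.05$, well below the predicted $\alpha_c\approx 0.14$) and the 10% corruption level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # pattern size
m = 10   # number of stored patterns; m/n = 0.05 << 0.14

# Random binary (+/-1) patterns, as in Hopfield's original setting
patterns = rng.choice([-1, 1], size=(m, n))

# Hebbian outer-product rule: W = (1/n) * sum_mu xi^mu (xi^mu)^T,
# with the self-coupling (diagonal) removed
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)

def retrieve(state, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern by flipping 10% of its bits, then recall it
probe = patterns[0].copy()
flip = rng.choice(n, size=n // 10, replace=False)
probe[flip] *= -1
recalled = retrieve(probe)

# Overlap of 1.0 means perfect recall of the stored pattern
overlap = (recalled @ patterns[0]) / n
```

At this loading ($\alpha=0.05$) the corrupted probe falls back into the stored pattern's basin of attraction and the overlap is (near) 1; pushing $m/n$ toward and past $\alpha_c$ makes retrieval increasingly error-prone, which is exactly the capacity question the paper studies.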