Maximum Likelihood Associative Memories
Associative memories are structures that store data in such a way that it can later be retrieved given only a part of its content -- a form of error/erasure resilience. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. First, we derive minimum residual error rates when the stored data comes from a uniform binary source. Second, we determine the minimum amount of memory required to store the same data. Finally, we bound the computational complexity of message retrieval. We then compare these bounds with two existing associative memory architectures: the celebrated Hopfield neural networks and a neural network architecture introduced more recently by Gripon and Berrou.
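To illustrate the retrieval principle the abstract describes, here is a minimal Python sketch of a maximum likelihood associative memory queried with erasures. The class and method names are illustrative, not from the paper; with a uniform binary source, maximum likelihood retrieval reduces to returning the unique stored message consistent with the known bits, and declaring an error when no match or more than one match exists.

    # Minimal sketch (assumed interface, not the paper's implementation):
    # exhaustively store binary messages and answer erasure queries by
    # maximum likelihood, which under a uniform prior means returning the
    # unique stored message consistent with the known bits.
    class MLAssociativeMemory:
        def __init__(self):
            self.messages = []  # exhaustive list of stored messages

        def store(self, message):
            """Store a binary message given as a sequence of 0/1 bits."""
            self.messages.append(tuple(message))

        def retrieve(self, partial):
            """Retrieve from a partial message; None marks an erased bit.

            Returns the unique consistent stored message, or None when
            retrieval fails (no match, or an ambiguous match)."""
            matches = [m for m in self.messages
                       if all(p is None or p == b
                              for p, b in zip(partial, m))]
            return matches[0] if len(matches) == 1 else None

    # Example: store two 8-bit messages and query with erased positions.
    mem = MLAssociativeMemory()
    mem.store([0, 1, 1, 0, 0, 1, 0, 1])
    mem.store([1, 1, 0, 0, 1, 0, 1, 1])
    print(mem.retrieve([0, 1, None, None, None, None, None, 1]))  # unique match
    print(mem.retrieve([None] * 8))  # ambiguous, so retrieval returns None

This exhaustive-storage sketch is optimal in residual error rate but makes no attempt at the memory or complexity savings the paper analyzes; it serves only to pin down what "maximum likelihood retrieval" means for erasure queries.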
Download manuscript.
Bibtex:
@inproceedings{GriRab20139,
author = {Vincent Gripon and Michael Rabbat},
title = {Maximum Likelihood Associative Memories},
booktitle = {Proceedings of the IEEE Information Theory Workshop},
year = {2013},
pages = {1--5},
month = {September},
}