Neural Associative Memories as Accelerators for Binary Vector Search
Associative memories aim at matching a noisy input vector with a stored one. The matched vector satisfies a minimum-distance criterion with respect to the inner metric of the device. This problem of finding nearest neighbors in terms of Euclidean or Hamming distance is a very common operation in machine learning and pattern recognition. However, the inner metrics of associative memories are often ill-suited to practical scenarios. In this paper, we adapt Willshaw networks so that they can accelerate nearest neighbor search with limited impact on accuracy. We provide a theoretical analysis of our method for binary sparse vectors. We also test our method on the MNIST handwritten digits database. Both our analysis on synthetic data and our experiments on real data show a significant gain in complexity with negligible loss in performance compared to exhaustive search.
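To make the setting concrete, here is a minimal sketch (not the authors' exact algorithm) of exhaustive Hamming-distance nearest neighbor search over binary vectors, together with a Willshaw-style binary associative memory used as a coarse pre-filter before exact verification; the class name WillshawFilter, the shortlist_size parameter and the shortlist heuristic are illustrative assumptions:

import numpy as np

def exhaustive_hamming_nn(query, database):
    """Baseline: exact nearest neighbor under Hamming distance."""
    dists = np.count_nonzero(database != query, axis=1)
    return int(np.argmin(dists))

class WillshawFilter:
    """Willshaw-style memory: clipped Hebbian (binary OR) weight matrix
    used to shortlist candidates before an exact Hamming check."""
    def __init__(self, dim):
        self.W = np.zeros((dim, dim), dtype=bool)
        self.patterns = []

    def store(self, x):
        x = x.astype(bool)
        self.W |= np.outer(x, x)           # clipped Hebbian update
        self.patterns.append(x)

    def search(self, query, shortlist_size=5):
        q = query.astype(bool)
        # One retrieval step: count, for each neuron, its connections to the
        # active query bits; keep neurons reaching the classical Willshaw threshold.
        scores = self.W[:, q].sum(axis=1)
        support = scores >= q.sum()
        # Rank stored patterns by overlap with the retrieved support,
        # then verify only a short list with exact Hamming distance.
        P = np.array(self.patterns)
        overlap = (P & support).sum(axis=1)
        shortlist = np.argsort(-overlap)[:shortlist_size]
        dists = np.count_nonzero(P[shortlist] != q, axis=1)
        return int(shortlist[np.argmin(dists)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, n, k = 256, 200, 12               # dimension, number of patterns, sparsity
    base = np.zeros((n, dim), dtype=bool)
    for i in range(n):
        base[i, rng.choice(dim, size=k, replace=False)] = True
    mem = WillshawFilter(dim)
    for x in base:
        mem.store(x)
    # Query: a stored pattern with one active bit erased (noisy input)
    q = base[42].copy()
    q[np.flatnonzero(q)[0]] = False
    print(exhaustive_hamming_nn(q, base), mem.search(q))

In this sketch the exact distance is only computed on the shortlist, which is where the complexity gain over exhaustive search would come from under the stated assumptions.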
Download manuscript.
Bibtex
@inproceedings{YuGriJiaJé20153,
  author = {Chendi Yu and Vincent Gripon and Xiaoran Jiang and Hervé Jégou},
  title = {Neural Associative Memories as Accelerators for Binary Vector Search},
  booktitle = {Proceedings of Cognitive},
  year = {2015},
  pages = {85--89},
  month = {March},
}