ParlayANN: Scalable and Deterministic Parallel Graph-Based Approximate Nearest Neighbor Search Algorithms

Cited by: 5
Authors
Manohar, Magdalen Dobson [1 ]
Shen, Zheqi [2 ]
Blelloch, Guy E. [1 ]
Dhulipala, Laxman [3 ]
Gu, Yan [2 ]
Simhadri, Harsha Vardhan [4 ]
Sun, Yihan [2 ]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[2] UC Riverside, Riverside, CA USA
[3] Univ Maryland, Baltimore, MD USA
[4] Microsoft Res, Redmond, WA USA
Source
Proceedings of the 29th ACM SIGPLAN Annual Symposium on Principles and Practice of Parallel Programming (PPoPP 2024) | 2024
Keywords
nearest neighbor search; vector search; parallel algorithms; similarity search
DOI
10.1145/3627535.3638475
CLC number
TP301 [Theory, Methods]
Subject classification code
081202
Abstract
Approximate nearest-neighbor search (ANNS) algorithms are a key part of the modern deep learning stack because they enable efficient similarity search over high-dimensional vector space representations (i.e., embeddings) of data. Among the various ANNS algorithms, graph-based algorithms are known to achieve the best throughput-recall tradeoffs. Despite the large scale of modern ANNS datasets, existing parallel graph-based implementations struggle to scale to large datasets due to heavy use of locks and other sequential bottlenecks, which 1) prevents them from efficiently scaling to a large number of processors, and 2) results in nondeterminism that is undesirable in certain applications. In this paper, we introduce ParlayANN, a library of deterministic and parallel graph-based approximate nearest neighbor search algorithms, along with a set of useful tools for developing such algorithms. In this library, we develop novel parallel implementations of four state-of-the-art graph-based ANNS algorithms that scale to billion-scale datasets. Our algorithms are deterministic and achieve high scalability across a diverse set of challenging datasets. In addition to the new algorithmic ideas, we also conduct a detailed experimental study of our new algorithms as well as two existing non-graph approaches. Our experimental results both validate the effectiveness of our new techniques and lead to a comprehensive comparison of ANNS algorithms on large-scale datasets, with a list of interesting findings.
Pages: 270-285
Number of pages: 16
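
To give a concrete sense of the query primitive shared by the graph-based ANNS algorithms the paper discusses, the sketch below shows a minimal, sequential greedy beam search over a proximity graph in C++. This is an illustrative sketch only: the names (Graph, beam_search, dist2) and structures are hypothetical and are not the ParlayANN API, whose actual implementations are parallel, deterministic, and considerably more sophisticated.

```cpp
// Illustrative sketch of greedy beam search over a proximity graph,
// the core query primitive in graph-based ANNS. Hypothetical types and
// function names; not the ParlayANN API.
#include <vector>
#include <unordered_set>
#include <algorithm>
#include <utility>
#include <cstddef>

using Point = std::vector<float>;

// Hypothetical adjacency-list proximity graph over a set of embeddings.
struct Graph {
    std::vector<Point> points;               // points[i] is vertex i's embedding
    std::vector<std::vector<int>> neighbors; // neighbors[i] is vertex i's out-edges
};

// Squared Euclidean distance between two embeddings.
static float dist2(const Point& a, const Point& b) {
    float d = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        float t = a[i] - b[i];
        d += t * t;
    }
    return d;
}

// Greedy beam search: starting from `entry`, repeatedly expand the closest
// not-yet-expanded candidate and keep the best `beam_width` candidates seen
// so far. Returns up to `k` approximate nearest neighbors of `query`.
std::vector<int> beam_search(const Graph& g, const Point& query,
                             int entry, std::size_t beam_width, std::size_t k) {
    using Cand = std::pair<float, int>;      // (distance to query, vertex id)
    std::vector<Cand> beam = {{dist2(query, g.points[entry]), entry}};
    std::unordered_set<int> expanded;

    while (true) {
        // Find the closest beam member that has not been expanded yet.
        int next = -1;
        for (const Cand& c : beam) {
            if (!expanded.count(c.second)) { next = c.second; break; }
        }
        if (next == -1) break;               // every beam member expanded: done
        expanded.insert(next);

        // Add the expanded vertex's out-neighbors as new candidates.
        for (int u : g.neighbors[next]) {
            beam.push_back({dist2(query, g.points[u]), u});
        }

        // Keep only the closest beam_width distinct candidates.
        std::sort(beam.begin(), beam.end());
        beam.erase(std::unique(beam.begin(), beam.end()), beam.end());
        if (beam.size() > beam_width) beam.resize(beam_width);
    }

    // Report the k closest candidates found.
    std::vector<int> result;
    for (std::size_t i = 0; i < beam.size() && i < k; ++i) {
        result.push_back(beam[i].second);
    }
    return result;
}
```

In this style of search, a larger beam_width raises recall at the cost of throughput, which is the throughput-recall tradeoff the abstract refers to when comparing graph-based algorithms.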