Embedded vector store for local-first AI applications
A single-file vector database written in Rust. Dense + sparse hybrid search, HNSW indexing, transactions, and crash-safe persistence — all in a portable .vdb file.
Dense vectors with HNSW + sparse BM25 keyword retrieval, fused by linear combination or reciprocal rank fusion (RRF).
Everything in one portable .vdb file. Crash-safe WAL, file locking, snapshots, and backups.
MongoDB-style operators: $eq, $gt, $in, $contains, $exists, with nested dot-path access.
Atomic batched writes with rollback. Context manager support in Python, try/catch in Node.js.
Native Rust core with Python (PyO3) and Node.js (napi-rs) bindings. Swift and Kotlin planned.
No server, no Docker, no network calls. Just import and use. Works offline.
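The MongoDB-style operators follow the usual semantics, which can be illustrated with a small standalone matcher. This is a plain-Python sketch of the operator behavior, not vectlite's implementation; `get_path` and `matches` are hypothetical helper names.

```python
def get_path(doc, path):
    """Resolve a dotted path like "stats.views" against nested dicts."""
    cur = doc
    for key in path.split("."):
        if not isinstance(cur, dict) or key not in cur:
            return None, False
        cur = cur[key]
    return cur, True

def matches(doc, flt):
    """Return True if `doc` satisfies every clause in the filter dict."""
    for path, cond in flt.items():
        value, exists = get_path(doc, path)
        # A bare value is shorthand for {"$eq": value}
        if not isinstance(cond, dict):
            cond = {"$eq": cond}
        for op, arg in cond.items():
            if op == "$exists":
                if exists != arg:
                    return False
            elif not exists:
                return False
            elif op == "$eq":
                if value != arg:
                    return False
            elif op == "$gt":
                if not value > arg:
                    return False
            elif op == "$in":
                if value not in arg:
                    return False
            elif op == "$contains":
                if arg not in value:
                    return False
    return True

doc = {"source": "blog", "stats": {"views": 120}, "title": "Auth Guide"}
print(matches(doc, {"source": "blog"}))                  # True
print(matches(doc, {"stats.views": {"$gt": 100}}))       # True (dot-path)
print(matches(doc, {"title": {"$contains": "Auth"}}))    # True
print(matches(doc, {"stats.likes": {"$exists": True}}))  # False
```

Clauses combine with implicit AND, and a missing path fails every operator except a `$exists: False` check.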
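The all-or-nothing batch semantics can be sketched with a staging buffer that applies every write on clean exit and discards them all on an exception. This is a plain-Python illustration of the guarantee, not vectlite's actual mechanism; the `AtomicBatch` class and its methods are made-up names.

```python
class AtomicBatch:
    """Stage writes in memory; apply them to `store` only on clean exit."""

    def __init__(self, store):
        self.store = store
        self.staged = {}

    def upsert(self, key, value):
        self.staged[key] = value

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            # Commit: all staged writes become visible at once
            self.store.update(self.staged)
        # On exception the staged dict is discarded (rollback)
        return False  # do not suppress the exception

store = {"doc1": "old"}
try:
    with AtomicBatch(store) as tx:
        tx.upsert("doc1", "new")
        tx.upsert("doc2", "added")
        raise RuntimeError("simulated failure")
except RuntimeError:
    pass
print(store)  # {'doc1': 'old'} (rolled back, nothing applied)

with AtomicBatch(store) as tx:
    tx.upsert("doc1", "new")
print(store)  # {'doc1': 'new'} (committed)
```

The context-manager protocol is what makes the Python `with` form natural here; the Node.js try/catch form wraps the same commit-on-success, discard-on-error logic.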
import vectlite
# Open or create a database
db = vectlite.open("knowledge.vdb", dimension=384)
# Insert vectors with metadata; embedding and embedding2 are
# 384-dim float sequences produced by your embedding model
db.upsert("doc1", embedding, {"source": "blog", "title": "Auth Guide"})
db.upsert("doc2", embedding2, {"source": "notes", "title": "Billing"})
# Hybrid search: dense + sparse
results = db.search(
    query_embedding,
    k=10,
    sparse=vectlite.sparse_terms("auth guide"),
    filter={"source": "blog"},
    fusion="rrf",
)
for r in results:
    print(r["id"], r["score"])
# Compact the file to reclaim space from deleted or overwritten entries
db.compact()
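The fusion="rrf" option in the search above refers to reciprocal rank fusion. The scoring rule itself is standard and can be sketched independently of the library: each document scores the sum of 1 / (k + rank) over every ranked list it appears in. The constant k=60 is the common default in the RRF literature, not necessarily vectlite's.

```python
def rrf(rankings, k=60):
    """Fuse ranked ID lists: score(id) = sum over lists of 1 / (k + rank)."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

dense = ["doc2", "doc1", "doc3"]  # ranked by vector similarity
sparse = ["doc1", "doc3"]         # ranked by BM25
fused = rrf([dense, sparse])
print([doc_id for doc_id, _ in fused])  # ['doc1', 'doc3', 'doc2']
```

Note how doc3, present in both lists, outranks doc2, which topped only the dense list: RRF rewards agreement between retrievers without needing their raw scores to be comparable.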