Go library for embedded vector search and semantic embeddings
This Go library provides embedded vector search and semantic embeddings for small to medium-scale projects, aimed at Go developers who need semantic search without complex infrastructure. It leverages llama.cpp and GGUF BERT models to deliver accurate, fast, and lean semantic search with optional GPU acceleration.
How It Works
The library utilizes llama.cpp, accessed via purego to avoid cgo, enabling direct interaction with shared C libraries from Go. This simplifies integration and cross-compilation. It supports GGUF-formatted BERT models for generating text embeddings. For search, it implements a brute-force nearest neighbor approach with SIMD optimizations, suitable for datasets under 100,000 entries, and allows saving/loading search indexes.
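To make the brute-force approach concrete, the sketch below shows a naive nearest-neighbor search over embedding vectors using cosine similarity. It is a simplified illustration of the idea, not the library's actual SIMD-optimized implementation, and all type and function names are invented for this example.

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// entry pairs an embedding vector with the text it was computed from.
// Illustrative only: the library uses SIMD kernels for the same math.
type entry struct {
	Text   string
	Vector []float32
}

// cosine returns the cosine similarity between two equal-length vectors.
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// search scans every entry (brute force) and returns the k most similar ones.
func search(entries []entry, query []float32, k int) []entry {
	sort.Slice(entries, func(i, j int) bool {
		return cosine(entries[i].Vector, query) > cosine(entries[j].Vector, query)
	})
	if k > len(entries) {
		k = len(entries)
	}
	return entries[:k]
}

func main() {
	// Toy 3-dimensional "embeddings"; real BERT embeddings have hundreds of dimensions.
	entries := []entry{
		{"cats and dogs", []float32{0.9, 0.1, 0.0}},
		{"stock markets", []float32{0.0, 0.8, 0.6}},
		{"pet grooming", []float32{0.8, 0.2, 0.1}},
	}
	for _, e := range search(entries, []float32{1, 0, 0}, 2) {
		fmt.Println(e.Text)
	}
}
```

Because every query scans every stored vector, the cost grows linearly with the dataset, which is why the library recommends staying under roughly 100,000 entries.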
Quick Start & Requirements
Prebuilt llama.cpp shared libraries are provided in the dist directory for Windows and Linux. For other platforms or custom builds, compile from source.
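The snippet below sketches a typical embed-then-search flow. The import path, the function names (NewVectorizer, EmbedText, NewIndex, Add, Search, Close), the GPU-layers parameter, and the model path are assumptions based on the description above, not the confirmed API; check the repository's README and examples for the exact calls.

```go
package main

import (
	"fmt"
	"log"

	"github.com/kelindar/search" // assumed import path
)

func main() {
	// Load a GGUF BERT model via the bundled llama.cpp shared library.
	// NewVectorizer, EmbedText, NewIndex, Add and Search are illustrative
	// names; consult the actual package documentation before use.
	m, err := search.NewVectorizer("dist/minilm-l6-v2.q8_0.gguf", 0 /* GPU layers */)
	if err != nil {
		log.Fatal(err)
	}
	defer m.Close()

	// Build an in-memory index from text embeddings.
	index := search.NewIndex[string]()
	for _, text := range []string{"cats and dogs", "stock markets", "pet grooming"} {
		vec, err := m.EmbedText(text)
		if err != nil {
			log.Fatal(err)
		}
		index.Add(vec, text)
	}

	// Embed the query and retrieve the two closest entries.
	query, _ := m.EmbedText("animals")
	for _, r := range index.Search(query, 2) {
		fmt.Println(r)
	}
}
```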
Maintenance & Community
The project is maintained by kelindar. Further community or roadmap information is not detailed in the README.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The brute-force search approach becomes a performance bottleneck on datasets exceeding roughly 100,000 entries. The library lacks advanced query features such as multi-field filtering and fuzzy matching, and handling high-dimensional embeddings in real time requires sufficient GPU resources.