Library for neural search model fine-tuning and efficient inference
Neural-Cherche is a Python library for fine-tuning and deploying neural search models such as Splade, ColBERT, and SparseEmbed. It targets researchers and developers who need to adapt state-of-the-art retrieval models to their own datasets for improved performance in offline and online applications. The library streamlines training, inference, and embedding management.
How It Works
Neural-Cherche fine-tunes models with a triplet loss over training data formatted as (anchor, positive, negative) tuples. ColBERT can be fine-tuned from any Sentence Transformer checkpoint, while Splade and SparseEmbed must start from an MLM pre-trained model. The library also provides efficient inference classes for both the retrieval and ranking stages, so users can assemble hybrid search pipelines, and computed document embeddings can be saved to avoid redundant calculations.
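A minimal fine-tuning sketch, assuming the models, utils, and train modules described in the project documentation; the base checkpoint, learning rate, and batch size below are illustrative placeholders, and argument names should be verified against the installed release:

import torch
from neural_cherche import models, train, utils

device = "cuda" if torch.cuda.is_available() else "cpu"

# Splade and SparseEmbed start from an MLM pre-trained checkpoint;
# ColBERT can instead start from any Sentence Transformer checkpoint.
model = models.Splade(model_name_or_path="distilbert-base-uncased", device=device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-6)

# Training data as (anchor, positive, negative) triples.
X = [
    ("capital of france", "Paris is the capital of France.", "Berlin is the capital of Germany."),
    ("fastest land animal", "The cheetah is the fastest land animal.", "Snails move slowly."),
]

for step, (anchor, positive, negative) in enumerate(
    utils.iter(X, epochs=1, batch_size=8, shuffle=True)
):
    loss = train.train_splade(
        model=model,
        optimizer=optimizer,
        anchor=anchor,
        positive=positive,
        negative=negative,
        step=step,
        gradient_accumulation_steps=50,
    )

# Save the fine-tuned weights for later inference.
model.save_pretrained("checkpoint")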
Quick Start & Requirements
pip install neural-cherche
or pip install "neural-cherche[eval]" for evaluation during training.
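Once installed, a fine-tuned or pre-trained model can be wrapped in a retriever for inference. The sketch below follows the encode/add/search pattern from the project documentation and reuses the checkpoint saved in the fine-tuning sketch above; class and parameter names are assumptions to check against the installed version.

from neural_cherche import models, retrieve

# "checkpoint" is the placeholder directory saved by the fine-tuning sketch above.
model = models.Splade(model_name_or_path="checkpoint", device="cpu")
retriever = retrieve.Splade(key="id", on=["text"], model=model)

documents = [
    {"id": 0, "text": "Paris is the capital of France."},
    {"id": 1, "text": "The cheetah is the fastest land animal."},
]

# Document embeddings are computed once and indexed; they can be stored to avoid recomputation.
documents_embeddings = retriever.encode_documents(documents=documents)
retriever.add(documents_embeddings=documents_embeddings)

# Encode queries and score them against the indexed documents.
queries_embeddings = retriever.encode_queries(queries=["capital of france"])
scores = retriever(queries_embeddings=queries_embeddings, k=2)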
Highlighted Details
Maintenance & Community
Latest activity: 4 months ago. Status: inactive.
Licensing & Compatibility
Limitations & Caveats
The Splade model is restricted to non-commercial use, which may limit its applicability in certain enterprise environments. Fine-tuning Splade and SparseEmbed requires a masked-language-model (MLM) pre-trained checkpoint, adding a dependency on specific model architectures.