AI-powered answer engine using open-source LLMs
Sensei Search is an AI-powered answer engine that synthesizes answers from web search results, similar to Perplexity AI. It targets users seeking an open-source alternative for AI-driven search and research, and it can run on a range of open-source LLMs.
How It Works
Sensei Search pairs a Next.js/Tailwind CSS frontend with a FastAPI backend. It integrates multiple LLMs, including the open-source Command-R, Qwen-2-72b-instruct, and WizardLM-2 8x22B, alongside commercial options such as Claude Haiku and GPT-3.5-turbo. Search is powered by SearxNG and Bing, with Redis serving as the memory store. This architecture allows flexibility in both model selection and search-source integration.
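The request flow implied by this stack can be sketched as follows. This is a minimal illustration, not Sensei Search's actual code: the endpoint name, environment variables, prompt, and caching policy are assumptions, and any OpenAI-compatible client can stand in for the open-source model servers.

```python
# Minimal sketch of a search -> LLM -> answer flow with FastAPI, SearxNG, and Redis.
# Endpoint, env vars, prompt, and cache TTL are illustrative assumptions.
import os

import httpx
import redis.asyncio as redis
from fastapi import FastAPI
from openai import AsyncOpenAI

app = FastAPI()
cache = redis.from_url(os.getenv("REDIS_URL", "redis://localhost:6379"))
llm = AsyncOpenAI(
    base_url=os.getenv("LLM_BASE_URL"),  # point at any OpenAI-compatible server
    api_key=os.getenv("LLM_API_KEY", "not-needed-for-local-servers"),
)

SEARXNG_URL = os.getenv("SEARXNG_URL", "http://localhost:8080")


@app.get("/answer")
async def answer(q: str):
    # 1. Return a cached answer if this exact query was seen before.
    cached = await cache.get(q)
    if cached:
        return {"answer": cached.decode(), "cached": True}

    # 2. Fetch web results from SearxNG's JSON API.
    async with httpx.AsyncClient() as client:
        resp = await client.get(
            f"{SEARXNG_URL}/search", params={"q": q, "format": "json"}
        )
        results = resp.json().get("results", [])[:5]
    context = "\n".join(f"- {r['title']}: {r.get('content', '')}" for r in results)

    # 3. Ask the LLM to synthesize an answer grounded in the search results.
    completion = await llm.chat.completions.create(
        model=os.getenv("LLM_MODEL", "gpt-3.5-turbo"),
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {q}"},
        ],
    )
    text = completion.choices[0].message.content

    # 4. Cache the synthesized answer for an hour.
    await cache.set(q, text, ex=3600)
    return {"answer": text, "cached": False}
```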
Quick Start & Requirements
To run locally: `cd sensei_root_folder/ && docker compose up`. For cloud deployment: `pip install paka`, then `make provision-prod`, `make deploy-backend`, and `make deploy-frontend`. A quick smoke test for the local setup is sketched after these steps.
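Once the containers are up, a short check can confirm both services respond. The ports below are assumptions (Next.js commonly binds 3000 and FastAPI 8000) and may differ from the project's docker-compose configuration.

```python
# Hypothetical smoke test for a local docker compose deployment.
# Ports are assumptions, not taken from the project's compose file.
import httpx

for name, url in {
    "frontend": "http://localhost:3000",
    "backend": "http://localhost:8000/docs",  # FastAPI's built-in docs page
}.items():
    try:
        status = httpx.get(url, timeout=5).status_code
        print(f"{name}: HTTP {status}")
    except httpx.HTTPError as exc:
        print(f"{name}: unreachable ({exc})")
```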
Highlighted Details
Cloud provisioning and deployment are driven by paka.
Maintenance & Community
The project is maintained by jjleng. Further community or maintenance details are not explicitly provided in the README.
Licensing & Compatibility
The README does not specify a license. Compatibility for commercial use or closed-source linking is not detailed.
Limitations & Caveats
Running the larger open-source LLMs locally requires significant GPU resources. Cloud deployment necessitates AWS setup and Hugging Face token usage, implying potential costs and vendor lock-in. The project appears to be in active development, with no explicit stability guarantees mentioned.