Local search aggregator using LLM agents
Top 8.8% on sourcepulse
LLocalSearch is a privacy-focused, locally-run search aggregator that uses LLM agents to find answers to user queries without relying on external APIs. It's designed for users who want to avoid data manipulation by commercial search engines and prefer a transparent, auditable search process.
How It Works
The system employs a chain of locally hosted Large Language Models (LLMs) with access to a suite of tools, including internet search. Based on the user's query and the results returned by earlier tool calls, the LLM can select and invoke tools repeatedly, recursively refining its context until it has gathered enough current information to construct a comprehensive answer.
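To make that loop concrete, here is a minimal, self-contained Go sketch of one way such an agent cycle can work. The Tool interface, the websearch stub, and the llmStep function are hypothetical placeholders, not LLocalSearch's actual code; they only illustrate the recursion in which the model either calls a tool and receives an observation, or emits a final answer.

```go
// Illustrative tool-calling agent loop. All names here are hypothetical and
// do not mirror LLocalSearch's real implementation.
package main

import (
	"fmt"
	"strings"
)

// Tool is anything the agent can invoke, e.g. a web search or a page scraper.
type Tool interface {
	Name() string
	Run(input string) (string, error)
}

// llmStep stands in for a call to a locally hosted model (e.g. via Ollama).
// It returns either a tool request ("TOOL:<name>:<input>") or a final answer.
func llmStep(scratchpad string) string {
	// A real implementation would send the scratchpad to the local LLM and
	// parse its response; this stub stops after one simulated search.
	if !strings.Contains(scratchpad, "observation:") {
		return "TOOL:websearch:latest results for the user query"
	}
	return "FINAL: answer assembled from the gathered observations"
}

type webSearch struct{}

func (webSearch) Name() string { return "websearch" }
func (webSearch) Run(input string) (string, error) {
	// A real tool would query a search backend and return snippets.
	return "stub search results for: " + input, nil
}

func main() {
	tools := map[string]Tool{"websearch": webSearch{}}
	scratchpad := "question: what changed in the latest release?"

	// The loop mirrors the recursive process described above: the model may
	// call tools any number of times, feeding each observation back in.
	for step := 0; step < 5; step++ {
		out := llmStep(scratchpad)
		if strings.HasPrefix(out, "FINAL:") {
			fmt.Println(strings.TrimSpace(strings.TrimPrefix(out, "FINAL:")))
			return
		}
		parts := strings.SplitN(strings.TrimPrefix(out, "TOOL:"), ":", 2)
		tool, ok := tools[parts[0]]
		if !ok {
			scratchpad += "\nobservation: unknown tool " + parts[0]
			continue
		}
		obs, err := tool.Run(parts[1])
		if err != nil {
			obs = "error: " + err.Error()
		}
		scratchpad += "\nobservation: " + obs
	}
	fmt.Println("stopped after too many steps without a final answer")
}
```

In the real system, llmStep would call the locally hosted model and the tool set would include live web search, so each iteration can pull in current information before the final answer is written.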
Quick Start & Requirements
Start the stack with docker-compose up -d. Runtime configuration is supplied through a .env file.
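As a rough quick-start sketch (the repository URL, the env-example file name, and the comments below are assumptions; verify them against the project's own documentation):

```sh
# Hypothetical quick start; file and variable names may differ from the actual repo.
git clone https://github.com/nilsherzig/LLocalSearch.git
cd LLocalSearch
cp env-example .env      # adjust values, e.g. the address of your local Ollama instance
docker-compose up -d     # then open the web interface in a browser
```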
Maintenance & Community
The project has not been under active development for over a year, with the author working on a private beta rewrite. Interested users are encouraged to contact the author for participation.
Licensing & Compatibility
No license is explicitly stated in the project's README.
Limitations & Caveats
Development has been dormant pending the private-beta rewrite noted above, and compatibility issues with newer LLMs such as Llama 3, particularly around stop-word handling, are still being worked on.