Automated research assistant using local LLMs
Top 17.0% on sourcepulse
This Python program turns a locally hosted LLM (via Ollama) into an automated web researcher. It systematically breaks a query into sub-topics, searches the web, scrapes relevant content, and compiles the findings into a documented trail, giving users in-depth, structured online research without manual intervention.
How It Works
The tool takes a user query, decomposes it into prioritized research areas, and iteratively searches the web, scrapes relevant content, and extracts source URLs. It dynamically generates new research avenues based on discovered information, allowing continuous research until manually stopped. Upon termination, the LLM synthesizes all collected data into a comprehensive summary and enters a conversational mode for follow-up questions.
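The loop described above can be sketched as follows. All function names and return shapes here are illustrative stand-ins for the tool's internals (which actually prompt the Ollama model and perform real web searches), not its API:

```python
# Minimal sketch of the iterative research loop. Every function below is a
# stub standing in for an LLM call or a web search/scrape step.

def decompose_query(query):
    """Split a query into prioritized research areas (stub for an LLM prompt)."""
    return [f"{query}: background", f"{query}: recent developments"]

def search_and_scrape(area):
    """Search the web for an area and scrape page text (stubbed)."""
    return {"area": area,
            "content": f"scraped notes on {area}",
            "sources": [f"https://example.com/{area.replace(' ', '-')}"]}

def propose_new_areas(finding):
    """Generate follow-up research avenues from what was just found (stubbed)."""
    return [f"{finding['area']} - open questions"]

def research(query, max_rounds=2):
    """Iterate until stopped (here: a fixed round budget), then summarize."""
    areas = decompose_query(query)
    findings = []
    for _ in range(max_rounds):
        if not areas:
            break
        area = areas.pop(0)                        # highest-priority area first
        finding = search_and_scrape(area)
        findings.append(finding)
        areas.extend(propose_new_areas(finding))   # dynamically add new avenues
    summary = " | ".join(f["content"] for f in findings)  # stand-in for LLM synthesis
    return findings, summary

findings, summary = research("local LLMs")
print(summary)
```

The key design point is that `propose_new_areas` feeds back into the work queue, which is why the real tool can keep researching indefinitely until the user stops it.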
Quick Start & Requirements
1. Install dependencies: `pip install -r requirements.txt`
2. Pull an Ollama model such as `phi3:3.8b-mini-128k-instruct` or `phi3:14b-medium-128k-instruct` (with custom context length), and set it in `llm_config.py`.
3. Launch the tool: `python Web-LLM.py`
4. Start a research session by typing `@` followed by your query, then press CTRL+D to submit.
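A model entry in `llm_config.py` might look roughly like the following; the field names are illustrative assumptions, not the file's actual schema:

```python
# Hypothetical sketch of an llm_config.py entry; the real file's keys may
# differ. The essential pairing is a 128k-context phi3 model plus an
# explicit custom context-length setting.
LLM_CONFIG = {
    "llm_type": "ollama",
    "model_name": "phi3:3.8b-mini-128k-instruct",  # or phi3:14b-medium-128k-instruct
    "context_length": 55000,  # assumed key; larger windows need more RAM
}
```

Whichever model you choose, keep the configured context length within what your hardware can actually hold in memory.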
Maintenance & Community
This is a prototype project by a new programmer. Contributions and suggestions are welcome.
Licensing & Compatibility
Licensed under the MIT License. Compatible with commercial use and closed-source linking.
Limitations & Caveats
The project is described as a prototype and is still in development. While functional, it may have room for improvements and new features. Windows users should refer to the `/feature/windows-support` branch for specific instructions.