local-deep-researcher by langchain-ai

Local web research assistant using local LLMs

created 8 months ago
7,903 stars

Top 6.7% on sourcepulse

Project Summary

This project provides a fully local web research and report writing assistant for users who want to leverage their own LLMs. It automates the process of generating search queries, gathering and summarizing web results, identifying knowledge gaps, and iteratively refining research to produce a comprehensive markdown report with cited sources.

How It Works

Inspired by IterDRAG, the assistant uses a local LLM (via Ollama or LMStudio) to generate web search queries based on a given topic. It then retrieves and summarizes relevant web content. The LLM reflects on the summary to identify knowledge gaps, generating new queries to address them. This iterative process repeats for a configurable number of cycles, progressively enriching the research.
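The iterative loop described above can be sketched as follows. This is an illustrative outline only, not the project's actual code: the stub functions stand in for the local LLM and web-search calls.

```python
from typing import Optional

# Stubs standing in for the LLM and search backends (illustrative only).
def generate_query(topic: str) -> str:
    return f"what is {topic}?"

def web_search(query: str) -> list[str]:
    return [f"result for: {query}"]  # e.g. DuckDuckGo by default

def summarize(summary: str, results: list[str]) -> str:
    # The LLM folds new results into the running summary.
    return (summary + " " + " ".join(results)).strip()

def reflect(summary: str, cycle: int, max_cycles: int) -> Optional[str]:
    # The LLM inspects the summary for a knowledge gap; here we pretend a
    # gap exists until the configured cycle budget is spent.
    return f"gap after cycle {cycle}" if cycle < max_cycles - 1 else None

def research(topic: str, max_cycles: int = 3) -> str:
    """Generate a query, search, summarize, reflect, and repeat."""
    summary = ""
    query = generate_query(topic)
    for cycle in range(max_cycles):
        summary = summarize(summary, web_search(query))
        gap = reflect(summary, cycle, max_cycles)
        if gap is None:
            break
        query = generate_query(gap)  # new query targeting the gap
    return f"# Report\n\n{summary}"  # markdown report
```

The number of cycles maps to the configurable research depth; each pass enriches the summary before the final report is written.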

Quick Start & Requirements

  • Install: Clone the repository, copy .env.example to .env, and configure environment variables.
  • LLM: Requires Ollama or LMStudio with a compatible local LLM (e.g., DeepSeek R1, Qwen).
  • Search: Defaults to DuckDuckGo; API keys for Tavily or Perplexity can be configured.
  • Launch: After installing dependencies (via uvx, or pip install -e . plus pip install -U "langgraph-cli[inmem]"), run langgraph dev.
  • Docs: See LangChain Academy Module 6 for deployment guidance.
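The steps above boil down to a short setup sequence. This is a sketch based on the listed instructions; the repository URL is inferred from the project name, and the exact environment variables to set depend on the README.

```shell
# Clone the repository and configure the environment
git clone https://github.com/langchain-ai/local-deep-researcher.git
cd local-deep-researcher
cp .env.example .env        # then edit .env with your LLM/search settings

# Install dependencies and the LangGraph CLI
pip install -e .
pip install -U "langgraph-cli[inmem]"

# Launch the local LangGraph server (opens LangGraph Studio)
langgraph dev
```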

Highlighted Details

  • Fully local operation, no external API calls for LLM processing.
  • Supports Ollama and LMStudio for local LLM hosting.
  • Iterative research process with knowledge gap identification.
  • Outputs a markdown report with source citations.
  • Integrates with LangGraph Studio for visualization and configuration.

Maintenance & Community

This project is part of the LangChain AI ecosystem. Further community and development details can be found on the LangChain GitHub and associated channels.

Licensing & Compatibility

The repository does not explicitly state a license in the README. Compatibility for commercial use or closed-source linking would require clarification of the licensing terms.

Limitations & Caveats

Some LLMs may struggle with structured JSON output required by the agent, though fallback mechanisms exist. Browser compatibility issues are noted for Safari users with LangGraph Studio UI. A TypeScript port is available but omits Perplexity search.
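A fallback for unreliable structured output typically looks like the sketch below. This is a generic pattern, not the project's actual implementation: try strict JSON parsing first, then salvage the first brace-delimited span, then fall back to treating the raw text as the payload.

```python
import json

def parse_llm_json(text: str) -> dict:
    """Parse LLM output as JSON, tolerating surrounding chatter."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        pass
    # Salvage attempt: extract the outermost {...} span, if any.
    start, end = text.find("{"), text.rfind("}")
    if start != -1 and end > start:
        try:
            return json.loads(text[start:end + 1])
        except json.JSONDecodeError:
            pass
    # Last resort: treat the raw text as the value itself
    # (the "query" key here is illustrative).
    return {"query": text.strip()}
```

A call like parse_llm_json('Sure! {"query": "llm benchmarks"}') recovers the embedded object even when the model prepends conversational filler.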

Health Check

  • Last commit: 1 month ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 1
  • Star History: 721 stars in the last 90 days
