DistiLlama by shreyaskarnik

Chrome extension for local LLM web/document summarization and chat

created 1 year ago
295 stars

Top 90.7% on sourcepulse

View on GitHub
Project Summary

DistiLlama is a Chrome extension designed for users who want to privately summarize or chat with web pages and local documents using locally running Large Language Models (LLMs). It addresses the need for data privacy by keeping all LLM interactions and data on the user's machine, leveraging Ollama for local LLM execution.

How It Works

The extension utilizes Ollama as the backend for running LLMs locally. It extracts text content from the active browser tab using the Readability library to ensure cleaner, more relevant text. This extracted text is then processed by LangChain.js to perform summarization or chat functionalities. The results are displayed in a user-friendly popup window within the Chrome extension.
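
The flow might look roughly like the LangChain.js sketch below (a minimal illustration under assumptions, not the extension's actual source; import paths, the Ollama port, and the model name depend on the versions and setup in use):

```typescript
// Minimal sketch of the pipeline described above (assumed names and port, not DistiLlama's real code):
// Readability pulls clean article text from the page, then a LangChain.js summarization
// chain runs the chunks through a model served by a local Ollama instance.
import { Readability } from "@mozilla/readability";
import { Ollama } from "@langchain/community/llms/ollama";
import { loadSummarizationChain } from "langchain/chains";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";

async function summarizeCurrentPage(): Promise<string> {
  // Extract the readable article text from the active tab's DOM.
  const article = new Readability(document.cloneNode(true) as Document).parse();
  if (!article?.textContent) throw new Error("No readable content found on this page");

  // Point LangChain.js at the local Ollama server (port taken from the serve command below).
  const model = new Ollama({ baseUrl: "http://127.0.0.1:11435", model: "mistral:latest" });

  // Split long pages into chunks and summarize them with a map-reduce style chain.
  const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 2000, chunkOverlap: 200 });
  const docs = await splitter.createDocuments([article.textContent]);
  const chain = loadSummarizationChain(model, { type: "map_reduce" });
  const result = await chain.invoke({ input_documents: docs });
  return result.text;
}
```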

Quick Start & Requirements

  • Install Ollama: Download and install Ollama, or run it via Docker. Start Ollama with OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve (a quick reachability check is sketched after this list).
  • Pull Models: Use ollama pull llama2:latest or ollama pull mistral:latest (or other supported models from https://ollama.ai/library).
  • Clone & Install: Clone the repository, run pnpm install, and build the extension so that the dist folder is produced.
  • Load Extension: Enable developer mode in chrome://extensions/, then load the dist folder as an unpacked extension.
  • Prerequisites: Node.js (with pnpm) and Ollama.
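
Once Ollama is running, one way to confirm the extension will be able to reach it is to query Ollama's /api/tags endpoint. The script name and URL below are assumptions matching the serve command above:

```typescript
// check-ollama.ts (hypothetical smoke test; run with: npx tsx check-ollama.ts)
// Confirms the local Ollama server started above is reachable and lists the pulled models.
const OLLAMA_URL = "http://127.0.0.1:11435"; // matches OLLAMA_HOST from the serve command

async function main(): Promise<void> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`); // Ollama's endpoint for locally available models
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  const { models } = (await res.json()) as { models: { name: string }[] };
  console.log("Models available to the extension:", models.map((m) => m.name).join(", "));
}

main().catch((err) => {
  console.error("Could not reach Ollama; check OLLAMA_HOST and OLLAMA_ORIGINS:", err);
  process.exit(1);
});
```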

Highlighted Details

  • Supports local LLM summarization and chat with web pages/documents.
  • Prioritizes user data privacy by keeping all processing local.
  • Integrates with Ollama for easy local LLM setup and management.
  • Uses Readability for improved text extraction quality.

Maintenance & Community

The project is maintained by shreyaskarnik. Further community engagement details are not specified in the README.

Licensing & Compatibility

The license is not explicitly stated in the README. Compatibility for commercial use or closed-source linking is not detailed.

Limitations & Caveats

The UI is noted as a work in progress. Features such as configurable summarization chains, saving summaries, text-to-speech (TTS) support, and advanced prompt tuning are listed as future enhancements, and the README tracks several open "TODO" items.

Health Check

  • Last commit: 11 months ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

3 stars in the last 90 days

Explore Similar Projects

Starred by Chip Huyen (author of AI Engineering and Designing Machine Learning Systems), Simon Willison (co-creator of Django), and 1 more.

Lumos by andrewnguonly

Chrome extension for local LLM web RAG co-piloting
Top 0.1% on sourcepulse · 2k stars
created 1 year ago · updated 6 months ago