Chrome extension for local LLM web/document summarization and chat
Top 90.7% on sourcepulse
DistiLlama is a Chrome extension designed for users who want to privately summarize or chat with web pages and local documents using locally running Large Language Models (LLMs). It addresses the need for data privacy by keeping all LLM interactions and data on the user's machine, leveraging Ollama for local LLM execution.
How It Works
The extension uses Ollama as the local LLM backend. It extracts the main text content from the active browser tab with the Readability library, stripping boilerplate so the model sees cleaner, more relevant text. The extracted text is then passed to LangChain.js for summarization or chat, and the results are displayed in the extension's popup window.
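The pipeline above can be sketched in TypeScript. This is an illustration, not the extension's actual code: `chunkText` mimics the text-splitting step a LangChain.js summarization chain performs on long pages, and `buildSummaryPrompt` shows the kind of prompt that would be sent to the local model. Both function names and the chunk-size/overlap parameters are hypothetical.

```typescript
// Split extracted page text into overlapping chunks, as summarization
// chains do so each piece fits in the model's context window.
function chunkText(text: string, chunkSize = 1000, overlap = 100): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    // Advance by chunkSize minus overlap so adjacent chunks share context.
    start += chunkSize - overlap;
  }
  return chunks;
}

// Wrap a chunk in a summarization instruction for the local LLM.
function buildSummaryPrompt(text: string): string {
  return `Summarize the following text concisely:\n\n${text}`;
}
```

In a map-reduce style chain, each chunk's summary would then be combined and summarized again to produce the final result shown in the popup.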
Quick Start & Requirements
1. Start Ollama, allowing the extension's origin:
   OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
2. Pull a model:
   ollama pull llama2:latest
   or
   ollama pull mistral:latest
   (or other supported models from https://ollama.ai/library)
3. Install dependencies:
   pnpm install
4. Open chrome://extensions/, then load the dist folder as an unpacked extension.
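Once Ollama is serving on 127.0.0.1:11435 as configured above, the extension can reach it over HTTP. As a hedged sketch (the real extension goes through LangChain.js rather than raw fetch), the following hypothetical helper builds a request to Ollama's /api/generate endpoint:

```typescript
interface OllamaRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

// Build a non-streaming generate request for the local Ollama server.
// The host default matches the OLLAMA_HOST value in the serve command above.
function buildGenerateRequest(
  model: string,
  prompt: string,
  host: string = "http://127.0.0.1:11435",
): OllamaRequest {
  return {
    url: `${host}/api/generate`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks for a single JSON response instead of streamed chunks.
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// Usage (in an async context):
// const { url, init } = buildGenerateRequest("llama2:latest", "Summarize: ...");
// const res = await fetch(url, init);
```

Note that OLLAMA_ORIGINS=* is what permits cross-origin requests from the extension to the local server.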
Maintenance & Community
The project is maintained by shreyaskarnik. Further community engagement details are not specified in the README.
Licensing & Compatibility
The license is not explicitly stated in the README. Compatibility for commercial use or closed-source linking is not detailed.
Limitations & Caveats
The UI is noted as a work in progress. Features like configurable summarization chains, saving summaries, TTS support, and advanced prompt tuning are listed as future enhancements. The project is actively being developed with several "TODO" items.