Logseq plugin for local LLM integration via Ollama
Top 90.4% on sourcepulse
This Logseq plugin integrates with Ollama, enabling users to leverage local large language models (LLMs) directly within their note-taking workflow. It provides AI-powered assistance for tasks like summarization, content generation, and task decomposition, targeting Logseq users who want to enhance productivity with local AI capabilities.
How It Works
The plugin exposes several commands within Logseq, allowing users to interact with Ollama models. Key features include prompting the AI with or without page/block context, summarizing content, creating flashcards, and dividing tasks. It supports context menu commands, slash commands, and toolbar buttons for seamless integration. Advanced configuration allows per-block model selection via block properties and custom context commands through a dedicated Logseq page.
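As a sketch of the per-block configuration described above, a Logseq block can carry the ollama-generate-model property (property name as used elsewhere in this summary; the model name is a placeholder, not a plugin default):

```
- Summarize the attached meeting notes
  ollama-generate-model:: mistral
```

Logseq block properties use the `key:: value` syntax, so the model override travels with the block itself rather than living in global plugin settings.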
Quick Start & Requirements
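A minimal local setup, assuming the Ollama CLI is already installed (the model name is illustrative):

```shell
# Pull a model for the plugin to use; "llama3" is an example, not a plugin default.
ollama pull llama3

# Ollama typically runs as a background service after installation;
# start it manually if it is not already listening on its default port.
ollama serve
```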
Requires a locally installed and running Ollama instance; custom context commands are configured on the ollama-logseq-config page.
Highlighted Details
Per-block model selection is set with the ollama-generate-model block property, and custom context commands are defined on the ollama-logseq-config page.
Maintenance & Community
Licensing & Compatibility
The README does not specify a license, which may complicate commercial use or closed-source integration.
Limitations & Caveats
The plugin depends on a correctly configured local Ollama instance; without one, its commands will not function.
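Because the plugin depends on a local Ollama instance, it ultimately talks to Ollama's HTTP API (by default at http://localhost:11434). The sketch below shows what such a call can look like, assuming Ollama's documented /api/generate endpoint; it is an illustration, not the plugin's actual implementation:

```typescript
// Sketch: a non-streaming generate request against a local Ollama instance.
// The model name "llama3" is an example, not a plugin default.

interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Build the JSON body for Ollama's /api/generate endpoint.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

async function generate(prompt: string, model = "llama3"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  // With stream: false, the full completion arrives in the `response` field.
  const data = await res.json();
  return data.response;
}
```

If the request fails, the usual first checks are that `ollama serve` is running and that the requested model has been pulled.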