Obsidian plugin for LLM integration
Top 10.1% on sourcepulse
This plugin integrates Large Language Models (LLMs) into Obsidian, enabling users to interact with their notes via chat or a "Vault QA" mode. It targets Obsidian users seeking to enhance their personal knowledge management (PKM) with AI, offering features like custom prompts, note summarization, and AI agents for advanced workflows.
How It Works
The plugin leverages a local indexing approach for its "Vault QA" feature, ensuring user data remains on their device. It supports a wide array of LLM providers, including OpenAI, Azure, Google Gemini, Cohere, and Groq, as well as local models via LM Studio and Ollama. Users can connect using their own API keys, providing flexibility and cost control. The "Copilot Plus" feature introduces agentic capabilities, aiming to replicate tools like Cursor but optimized for PKM.
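The local-indexing idea behind "Vault QA" can be sketched as: embed each note on-device, then rank notes against a query by similarity. The sketch below is illustrative only, with a toy bag-of-words vector standing in for a real embedding model; the names `embed`, `topK`, and the `Note` shape are assumptions, not the plugin's actual API.

```typescript
// Toy local index for a "Vault QA"-style feature: embed each note locally,
// then answer queries by cosine similarity. A real implementation would call
// an embedding model; a term-frequency vector stands in here so that all
// data stays on the device.

type Note = { path: string; text: string };

// Build a vocabulary-indexed term-frequency vector (stand-in "embedding").
function embed(text: string, vocab: Map<string, number>): number[] {
  const vec = new Array(vocab.size).fill(0);
  for (const word of text.toLowerCase().split(/\W+/)) {
    const i = vocab.get(word);
    if (i !== undefined) vec[i] += 1;
  }
  return vec;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Index all notes, then return the paths of the k most similar notes.
function topK(notes: Note[], query: string, k: number): string[] {
  const vocab = new Map<string, number>();
  for (const n of notes) {
    for (const w of n.text.toLowerCase().split(/\W+/)) {
      if (w && !vocab.has(w)) vocab.set(w, vocab.size);
    }
  }
  const q = embed(query, vocab);
  return notes
    .map((n) => ({ path: n.path, score: cosine(embed(n.text, vocab), q) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((r) => r.path);
}

const notes: Note[] = [
  { path: "gardening.md", text: "Tomatoes need full sun and regular watering." },
  { path: "llm.md", text: "Large language models answer questions over notes." },
];
const hits = topK(notes, "watering tomatoes in the sun", 1);
```

In the real feature, the retrieved notes would then be passed as context to whichever LLM provider the user has configured.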
Quick Start & Requirements
Copy main.js, manifest.json, and styles.css to .obsidian/plugins/obsidian-copilot/.
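The copy step above can be sketched as a few shell commands; $VAULT here is a placeholder for your vault's location, not a path the plugin defines:

```shell
# Manual install sketch; VAULT is a hypothetical vault path, adjust it.
VAULT="${VAULT:-$HOME/ObsidianVault}"
PLUGIN_DIR="$VAULT/.obsidian/plugins/obsidian-copilot"

mkdir -p "$PLUGIN_DIR"
# Copy the three release files if they are present in the current directory.
for f in main.js manifest.json styles.css; do
  if [ -f "$f" ]; then cp "$f" "$PLUGIN_DIR/"; fi
done
```

Once copied, the plugin can be enabled from Obsidian's Community plugins settings.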
Highlighted Details
Maintenance & Community
The project is actively developed with frequent updates, including recent revamps to settings and new features such as inline editing and relevant-notes suggestions in chat. The developer, logancyang (@logancyang on Twitter/X), is responsive, and several contributors are credited for their work.
Licensing & Compatibility
The frontend code of the plugin is open-source. However, the backend code for "Copilot Plus" AI agents is proprietary and closed-source. A refund policy is offered for Copilot Plus.
Limitations & Caveats
"Copilot Plus" is a premium, paid product with a closed-source backend; it requires network access and collects server-side telemetry. Users must manage their own API keys and be mindful of per-request API costs. Azure OpenAI integration in particular can be complex to configure.