obsidian-ollama by hinterdupfinger

Obsidian plugin for local LLM-powered text manipulation

created 1 year ago
960 stars

Top 39.1% on sourcepulse

Project Summary

This plugin enables Obsidian users to integrate local Large Language Models (LLMs) via Ollama directly into their note-taking workflow. It offers pre-configured prompts for common text manipulation tasks and allows custom prompt creation, enhancing productivity for researchers and writers.

How It Works

The plugin sends a user-defined prompt, along with either the selected text or the entire note content, to a locally running Ollama instance. Ollama processes the request using the specified model and temperature, and the plugin inserts the generated text into the Obsidian note at the cursor's current position. This approach leverages local LLMs for privacy and offline capability.
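The plugin itself is an Obsidian (TypeScript) plugin, but the exchange it performs can be sketched against Ollama's public `/api/generate` endpoint. The model name, temperature, and prompt below are placeholder assumptions, not values taken from the plugin's code:

```python
import json
from urllib import request

# The plugin's default endpoint for a local Ollama instance.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, text: str, model: str = "mistral",
                  temperature: float = 0.2) -> dict:
    """Combine a command prompt with the selected note text into one request body."""
    return {
        "model": model,
        "prompt": f"{prompt}\n\n{text}",
        "stream": False,  # request a single JSON response instead of a token stream
        "options": {"temperature": temperature},
    }

def generate(prompt: str, text: str, **kwargs) -> str:
    """POST the payload to the local Ollama instance and return the generated text."""
    body = json.dumps(build_payload(prompt, text, **kwargs)).encode()
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

In the plugin, the string returned by a call like `generate(...)` would then be written into the editor at the cursor position.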

Quick Start & Requirements

  • Install via Obsidian's community plugin browser.
  • Requires a local Ollama installation (e.g., macOS app).
  • Connects to Ollama by default at http://localhost:11434.

Highlighted Details

  • Pre-configured prompts for summarization, explanation, expansion, and rewriting.
  • Supports custom prompts with user-defined models and temperature settings.
  • Inserts LLM output directly into the Obsidian note at the cursor.

Maintenance & Community

No specific community links or contributor details are provided in the README.

Licensing & Compatibility

The license is not specified in the README.

Limitations & Caveats

The plugin requires a local Ollama installation. The README mentions only the macOS app, so users on other operating systems may need to install and run Ollama by other means.

Health Check

  • Last commit: 1 year ago
  • Responsiveness: 1+ week
  • Pull requests (30d): 0
  • Issues (30d): 0
  • Star history: 20 stars in the last 90 days
