Emacs package for LLM interaction, abstracting model choice
This Emacs Lisp package provides a unified interface for interacting with various Large Language Models (LLMs), abstracting away differences between providers like OpenAI, Azure, Gemini, Claude, and local models via Ollama or llama.cpp. It enables Emacs users to leverage LLMs for chat, tool use, and embeddings, offering flexibility in choosing between paid APIs and free local models.
How It Works
The library uses a provider-based architecture where specific modules (e.g., llm-openai, llm-ollama) create provider objects that abstract LLM interactions. These providers handle features like chat (synchronous, asynchronous, and streaming), multi-modal input (images), tool use (calling Elisp functions), and embeddings. The core llm package provides generic functions that operate on these provider objects, so client code remains provider-agnostic.
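The sketch below illustrates this provider-agnostic design. It assumes the generic functions llm-chat, llm-chat-async, and llm-embedding and the prompt constructor llm-make-chat-prompt found in recent versions of the package; exact keyword arguments and constructor options can differ between releases and providers, so treat it as a sketch rather than a definitive reference.

```elisp
;; Minimal sketch: the same generic calls work against any provider object.
(require 'llm)
(require 'llm-openai)
(require 'llm-ollama)

;; Either provider satisfies the same generic interface.
(defvar my-llm-provider
  (if (getenv "OPENAI_API_KEY")
      (make-llm-openai :key (getenv "OPENAI_API_KEY"))
    ;; Model names are illustrative; use whatever you have pulled locally.
    (make-llm-ollama :chat-model "llama3" :embedding-model "nomic-embed-text")))

;; Synchronous chat: blocks until the full response arrives.
(llm-chat my-llm-provider
          (llm-make-chat-prompt "Summarize the Emacs manual in one sentence."))

;; Asynchronous chat: callbacks run when the response (or an error) arrives.
(llm-chat-async my-llm-provider
                (llm-make-chat-prompt "Name three Emacs Lisp data structures.")
                (lambda (response) (message "LLM says: %s" response))
                (lambda (_type msg) (message "LLM error: %s" msg)))

;; Embeddings: returns a vector of floats for the given text.
(llm-embedding my-llm-provider "literate programming")
```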
Quick Start & Requirements
- require the relevant provider module (e.g., (require 'llm-openai)).
- Create a provider object, e.g., (setq llm-refactoring-provider (make-llm-openai :key "YOUR_API_KEY")).
- API keys can be managed securely using auth-source (see the sketch below).
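One hedged way to combine this with auth-source is to store the key in ~/.authinfo.gpg and look it up when building the provider. The host and login values below are illustrative choices, not something the package requires:

```elisp
;; Assumed ~/.authinfo.gpg entry (host/login values are illustrative):
;;   machine api.openai.com login apikey password sk-...
(require 'auth-source)
(require 'llm-openai)

(setq llm-refactoring-provider
      (make-llm-openai
       :key (auth-source-pick-first-password :host "api.openai.com"
                                             :user "apikey")))
```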
Highlighted Details
Serves as the LLM backend for higher-level packages such as ellama, magit-gptcommit, and ekg.
.Maintenance & Community
Licensing & Compatibility
Limitations & Caveats