llm-openrouter: Access OpenRouter LLMs via CLI
This plugin integrates the LLM command-line utility with models hosted on OpenRouter, providing a unified interface to a vast array of LLMs. It targets developers and power users seeking efficient access to diverse AI models, including multimodal and structured output capabilities, directly from their terminal. The primary benefit is streamlined interaction with numerous LLMs without needing individual API integrations for each.
How It Works
The llm-openrouter plugin installs directly into the LLM environment. Users authenticate via an OpenRouter API key, which can be configured as an environment variable or stored using LLM's key management. This allows users to list, select, and prompt any model available through OpenRouter using standard LLM commands, abstracting away the complexities of different model providers.
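The workflow described above can be sketched as a short shell session. The commands follow LLM's standard plugin and key conventions; the model ID used in the final prompt is illustrative, and availability on OpenRouter may vary:

```shell
# Install the plugin into an existing LLM installation
llm install llm-openrouter

# Store the OpenRouter API key via LLM's key management
# (alternatively, export the OPENROUTER_KEY environment variable)
llm keys set openrouter

# Prompt an OpenRouter-hosted model through the standard llm interface;
# the model ID here is an example, not a guaranteed listing
llm -m openrouter/anthropic/claude-3.5-sonnet "Three words describing the sea"
```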
Quick Start & Requirements
Install with llm install llm-openrouter. The LLM command-line utility must be installed first. An OpenRouter API key is required, obtainable from OpenRouter; set it via the OPENROUTER_KEY environment variable or with llm keys set openrouter.
Highlighted Details
- Supports LLM's tool-calling framework.
- Web search results can be enabled for a prompt with the -o online 1 option.
- Model listing commands (llm models -q openrouter, llm openrouter models) show model details including context length, architecture, and pricing.
Maintenance & Community
The plugin is developed by Simon Willison. The documentation snippet does not mention community channels, a roadmap, or maintenance activity beyond the README itself.
Licensing & Compatibility
The license type is not specified in the provided README content.
Limitations & Caveats
The quality of schema support varies significantly between models, necessitating careful testing. The availability of features like multimodality or tool use is dependent on the specific OpenRouter-hosted model being utilized.
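Given that schema support varies by model, a quick per-model smoke test is worthwhile before relying on structured output. A minimal sketch using LLM's --schema option with its concise schema syntax; the model ID is an assumption and should be swapped for whichever model you intend to use:

```shell
# Probe a model's structured-output support with a small schema;
# LLM expands the concise syntax ("name, age int") into JSON Schema.
# The model ID below is illustrative only.
llm -m openrouter/openai/gpt-4o-mini \
  --schema 'name, age int, one_sentence_bio' \
  'invent a plausible software engineer'
```

If the model returns malformed or incomplete JSON for even a simple schema like this, treat its structured-output support as unreliable.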