req_llm by agentjido

Elixir SDK for standardized LLM API interaction

Created 2 months ago
296 stars

Top 89.6% on SourcePulse

View on GitHub
Project Summary

ReqLLM provides a unified, idiomatic Elixir interface for interacting with diverse Large Language Model (LLM) APIs. It standardizes requests and responses across numerous providers, simplifying integration for Elixir developers building AI applications by abstracting away provider-specific complexities.

How It Works

ReqLLM uses a two-layer architecture: a high-level API (inspired by the Vercel AI SDK) for common tasks, and a low-level Req plugin API for full HTTP control. OpenAI Chat Completions serves as the baseline wire format, with provider-specific callbacks adapting non-compatible APIs, so the developer experience stays consistent across providers.
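The two layers might be used roughly as follows. This is a sketch, not the library's documented API: the function names (`generate_text/2`, `attach/2`), the `"provider:model"` spec strings, and the response accessor are assumptions modeled on the Vercel-AI-SDK style the project describes; consult the ReqLLM docs for exact signatures.

```elixir
# High-level layer: one provider-agnostic call using a "provider:model" spec.
# NOTE: names below are illustrative assumptions, not verified signatures.
{:ok, response} = ReqLLM.generate_text("anthropic:claude-3-5-haiku", "Hello!")
IO.puts(ReqLLM.Response.text(response))

# Low-level layer: attach ReqLLM as a Req plugin to a plain request,
# keeping full control over headers, retries, and other HTTP options.
request =
  Req.new()
  |> ReqLLM.attach(model: "openai:gpt-4o-mini")

{:ok, resp} = Req.request(request)
```

Splitting the API this way lets everyday calls stay one-liners while still exposing Req's full middleware pipeline when needed.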

Quick Start & Requirements

Standard Elixir project setup (mix deps.get) is required. API keys are resolved through a layered lookup, from per-request overrides down to .env files. Elixir and standard development tools are necessary. A Discord community is available for support.
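A minimal setup might look like the following. The version constraint mirrors the RC line mentioned below, and the per-request `:api_key` option is illustrative of the layered key lookup rather than a confirmed option name; check the project README for current instructions.

```elixir
# mix.exs — add the dependency (version constraint is illustrative,
# tracking the v1.0.0-rc release line noted on this page)
def deps do
  [
    {:req_llm, "~> 1.0.0-rc"}
  ]
end

# Keys resolve through layers: a per-request override (option name below is
# an assumption) takes precedence over environment/.env configuration such
# as ANTHROPIC_API_KEY=... in a .env file.
ReqLLM.generate_text("anthropic:claude-3-5-haiku", "Hi",
  api_key: System.fetch_env!("ANTHROPIC_API_KEY")
)
```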

Highlighted Details

  • Supports 45+ providers and 665+ models via an auto-synced registry with metadata.
  • Features a canonical data model and supports multi-modal content parts (text, image URLs, tool calls).
  • Offers high-level (Vercel-AI style) and low-level (Req plugin) client layers.
  • Enables structured object generation with schema validation and OpenAI-native structured outputs.
  • Provides production-grade streaming via Finch with concurrent metadata collection.
  • Includes detailed usage and cost tracking (tokens, USD) per response.
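The structured-object-generation bullet above might look like this in practice. This is a sketch under assumptions: `generate_object/3`, the keyword-list schema shape, and the `Response.object/1` accessor follow common Elixir conventions (e.g. NimbleOptions-style schemas) but are not verified against the library's actual API.

```elixir
# Sketch: ask the model for data matching a schema, validated on return.
# Schema format and function names are assumptions; see the ReqLLM docs.
schema = [
  name: [type: :string, required: true],
  language: [type: :string, required: true]
]

{:ok, response} =
  ReqLLM.generate_object("openai:gpt-4o-mini", "Who created Elixir?", schema)

# A validated map, e.g. %{name: "José Valim", language: "Elixir"}
IO.inspect(ReqLLM.Response.object(response))
```

Schema validation at the boundary means malformed model output surfaces as an error tuple instead of leaking into application code.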

Maintenance & Community

The project is in release candidate status (v1.0.0-rc.6) and is actively seeking community feedback. A Discord server is available. High code quality is maintained via Dialyzer and Credo, backed by an extensive fixture-based test suite covering 130+ models across 10 providers, regularly refreshed against live APIs.

Licensing & Compatibility

Licensed under the Apache License, Version 2.0, permitting commercial use and integration into closed-source applications.

Limitations & Caveats

Streaming uses Finch directly because of Req limitations with SSE. Default Finch pools use HTTP/1 to work around a Finch bug (#265) with large HTTP/2 request bodies. The project is in release candidate status, so breaking API changes remain possible before 1.0.0.

Health Check
Last Commit

1 day ago

Responsiveness

Inactive

Pull Requests (30d)
57
Issues (30d)
34
Star History
92 stars in the last 30 days

Explore Similar Projects

Starred by Adam Wolff (Claude Code Core; MTS at Anthropic), Samuel Colvin (Founder and Author of Pydantic), and 5 more.

anthropic-sdk-python by anthropics

0.8%
2k
Python SDK for Anthropic's REST API
Created 2 years ago
Updated 1 day ago