This Elixir library provides a LangChain-style framework for integrating Large Language Models (LLMs) into Elixir applications. It enables developers to build data-aware, agentic applications by chaining LLMs with data sources, services, and custom Elixir functions.
How It Works
The framework offers modular components and off-the-shelf chains for interacting with LLMs. It emphasizes a functional approach, distinct from its OOP-based Python and JavaScript counterparts, and focuses on direct LLM interaction rather than complex state management for conversational history. Key features include exposing custom Elixir functions to LLMs via `LangChain.Function` for application integration, plus support for multiple LLM providers and self-hosted models.
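To make the function-exposure idea concrete, here is a minimal sketch of defining a tool with `LangChain.Function` and attaching it to an `LLMChain`. The tool name, SKU argument, stock logic, and model choice are all illustrative, and the call shapes follow the v0.3.x API as documented; treat this as a sketch rather than a canonical example.

```elixir
alias LangChain.Function
alias LangChain.Message
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI

# A hypothetical tool: the name and stock-lookup logic are illustrative,
# not part of the library.
lookup =
  Function.new!(%{
    name: "get_stock_level",
    description: "Returns the current stock level for a product SKU.",
    function: fn %{"sku" => sku} = _args, _context ->
      # Real code would query a database or external service here.
      {:ok, "Stock for #{sku}: 42 units"}
    end
  })

chain =
  %{llm: ChatOpenAI.new!(%{model: "gpt-4o"})}
  |> LLMChain.new!()
  |> LLMChain.add_tools([lookup])

# Running the chain performs a live API call, so guard on the key:
if System.get_env("OPENAI_API_KEY") do
  {:ok, updated_chain} =
    chain
    |> LLMChain.add_message(Message.new_user!("How much of SKU-123 is in stock?"))
    |> LLMChain.run(mode: :while_needs_response)

  IO.puts(updated_chain.last_message.content)
end
```

With `mode: :while_needs_response`, the chain keeps executing requested tool calls and feeding results back to the model until it produces a final answer.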
Quick Start & Requirements
- Install via `mix.exs`: `{:langchain, "0.4.0-rc.0"}` or `{:langchain, "0.3.3"}`.
- Configuration requires API keys for services like OpenAI and Anthropic, which can be set via environment variables or directly in `config/runtime.exs`.
- Supports OpenAI, Anthropic, Google Gemini/Vertex AI, Ollama, Mistral, Bumblebee models, and LMStudio via OpenAI compatibility.
- Documentation: https://github.com/brainlid/langchain#documentation
- Demo Project: Available for download.
- Example Livebook: "LangChain: Executing Custom Elixir Functions".
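The dependency and API-key setup above can be sketched as follows. The `openai_key`/`anthropic_key` config keys and environment-variable names follow the project README's conventions; adjust them to your deployment.

```elixir
# mix.exs — add the dependency (pin whichever release line you need):
def deps do
  [
    {:langchain, "0.3.3"}
    # or the release candidate: {:langchain, "0.4.0-rc.0"}
  ]
end

# config/runtime.exs — supply API keys at runtime:
import Config

config :langchain, openai_key: System.fetch_env!("OPENAI_API_KEY")
config :langchain, anthropic_key: System.fetch_env!("ANTHROPIC_API_KEY")
```

Using `System.fetch_env!/1` makes the application fail fast at boot if a required key is missing, rather than erroring on the first API call.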
Highlighted Details
- Enables exposing custom Elixir functions to LLMs for application integration.
- Supports OpenAI-compatible APIs for self-hosted or alternative LLM services.
- Includes prompt caching mechanisms for OpenAI and Gemini models.
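Self-hosted or alternative services that speak the OpenAI API can be reached by overriding the endpoint on the OpenAI chat model. This sketch assumes LMStudio's default local address and a placeholder model name; both are illustrative.

```elixir
alias LangChain.ChatModels.ChatOpenAI

# Point the OpenAI-compatible client at a local server
# (LMStudio's default address shown; adjust host and model to your setup).
local_llm =
  ChatOpenAI.new!(%{
    endpoint: "http://localhost:1234/v1/chat/completions",
    model: "local-model"
  })
```

The resulting struct can be passed anywhere an `llm` is expected, e.g. `LLMChain.new!(%{llm: local_llm})`.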
Maintenance & Community
- The project is actively developed, with release candidates (v0.4.x) introducing new features and breaking changes.
- Testing instructions are provided, including options for running tests against live API calls.
Licensing & Compatibility
- The library is available under the MIT license, permitting commercial use and integration with closed-source applications.
Limitations & Caveats
- Function calling for Bumblebee models is currently limited to Llama 3.1; JSON tool calling is not supported for Llama 2, Mistral, and Zephyr.
- Google Gemini and Vertex AI support is marked as deprecated in v0.4.x.