easyllm by philschmid

SDK for simplifying LLM interactions

Created 2 years ago
462 stars

Top 65.6% on SourcePulse

View on GitHub
Project Summary

EasyLLM provides a unified interface for interacting with various Large Language Models (LLMs), abstracting away differences between providers like OpenAI, Hugging Face, and Amazon SageMaker. This allows developers to easily switch between different LLM backends with minimal code changes, streamlining experimentation and deployment for both open-source and proprietary models.

How It Works

EasyLLM implements clients that mimic the OpenAI API's ChatCompletion, Completion, and Embedding interfaces. This compatibility layer lets users swap an OpenAI call for an equivalent call to another supported provider (e.g., Hugging Face, SageMaker) by changing only the import statement and the model identifier. It also ships helper modules for prompt formatting and a work-in-progress evol_instruct module for evolutionary instruction generation.
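As a minimal sketch of that swap (assuming the legacy openai v0.x ChatCompletion interface as the starting point), only the client import, a prompt builder setting, and the model identifier change:

import openai  # original call against the OpenAI API (legacy v0.x interface)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the sun?"}],
)

# Same call shape against a Hugging Face-hosted model via EasyLLM:
# swap the client import, set the prompt builder, and change the model id.
from easyllm.clients import huggingface

huggingface.prompt_builder = "llama2"
response = huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "What is the sun?"}],
)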

Quick Start & Requirements

  • Install via pip: pip install easyllm
  • Requires Python 3.
  • Example usage:
from easyllm.clients import huggingface
huggingface.prompt_builder = "llama2"
response = huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[
        {"role": "system", "content": "\nYou are a helpful assistant speaking like a pirate. argh!"},
        {"role": "user", "content": "What is the sun?"},
    ],
    temperature=0.9,
    top_p=0.6,
    max_tokens=256,
)
print(response)

Highlighted Details

  • OpenAI API compatible clients for ChatCompletion, Completion, and Embedding.
  • Supports switching between OpenAI, Hugging Face, SageMaker, and Amazon Bedrock LLMs.
  • Includes streaming support for completions (see the streaming sketch after this list).
  • Offers prompt utility helpers for format conversion, e.g., OpenAI messages to Llama 2 prompts (see the conversion sketch after this list).
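
A short streaming sketch, assuming the chunks follow the OpenAI-style delta format:

from easyllm.clients import huggingface

huggingface.prompt_builder = "llama2"
# stream=True yields the response incrementally as OpenAI-style chunks
for chunk in huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "What is the sun?"}],
    stream=True,
):
    delta = chunk["choices"][0]["delta"]
    if "content" in delta:
        print(delta["content"], end="")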
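
And a sketch of the prompt conversion helpers, assuming the build_llama2_prompt helper name used in the project docs:

from easyllm.prompt_utils import build_llama2_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the sun?"},
]
# Converts OpenAI-style messages into a single Llama 2 [INST]-formatted prompt string
prompt = build_llama2_prompt(messages)
print(prompt)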

Maintenance & Community

  • Project maintained by Philipp Schmid.
  • Open to contributions. Development uses hatch.
  • Citation available via BibTeX.

Licensing & Compatibility

  • Licensed under Apache-2.0.
  • Compatible with commercial use and closed-source linking.

Limitations & Caveats

The evol_instruct feature is marked as "work in progress." While the library aims for seamless backend switching, hyperparameters (e.g., temperature, top_p) do not always transfer cleanly between models and may need manual tuning per backend.

Health Check

  • Last Commit: 1 year ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 2 stars in the last 30 days
