llm-axe by emirsahin1

Toolkit for LLM-powered applications

created 1 year ago
260 stars

Top 98.2% on sourcepulse

Project Summary

This toolkit provides simple abstractions for common LLM application development tasks, targeting developers who want to integrate LLM capabilities without the complexity of larger frameworks. It offers features like automatic schema generation, pre-made agents with chat history, and customizable agents, enabling rapid implementation of LLM-powered features.

How It Works

The library abstracts common LLM interactions into intuitive classes and functions. It supports local LLMs (via Ollama) and external APIs, offering pre-built agents for tasks like web searching, PDF summarization, data extraction, and object detection. Its design emphasizes flexibility, allowing users to easily integrate their own LLMs and customize agent behavior with system prompts.
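The "integrate your own LLM" design described above can be sketched as a simple duck-typing contract: an agent only needs a model object that answers chat-style prompts. The sketch below is illustrative only; the `ask` method name, its parameters, and the `EchoLLM` class are assumptions for demonstration, not llm-axe's actual API.

```python
# Hypothetical sketch of a custom model object that an llm-axe-style
# agent could call. Assumption: agents invoke a single ask() method that
# takes a list of chat messages ({"role": ..., "content": ...} dicts)
# and returns a string. EchoLLM is a stand-in used purely for illustration.

class EchoLLM:
    """Minimal custom model: echoes the last user message back."""

    def ask(self, prompts, temperature=0.8):
        # Find the most recent user message and return it verbatim.
        last_user = next(
            p["content"] for p in reversed(prompts) if p["role"] == "user"
        )
        return f"echo: {last_user}"


if __name__ == "__main__":
    llm = EchoLLM()
    reply = llm.ask([{"role": "user", "content": "Hello, agent!"}])
    print(reply)  # echo: Hello, agent!
```

Because the contract is just a method on a plain object, swapping a local Ollama model for a hosted API (or a test double like the one above) requires no changes to the agent code itself.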

Quick Start & Requirements

  • Primary install: pip install llm-axe
  • Prerequisites: Python 3.x. Testing was performed with llama3 8B instruct (4-bit quantization) via Ollama.
  • Documentation: Development Documentation (the project README is the primary source of information).

Highlighted Details

  • Function Calling: Enables LLM function calling with minimal code, without requiring pre-made schemas or specialized prompts.
  • Online Agent: Facilitates internet access for LLMs to retrieve and process information from websites.
  • PDF Reader & Data Extractor: Agents specifically designed to read PDF documents and extract structured information based on specified fields.
  • Object Detector Agent: Integrates with multimodal LLMs (like LLaVA) for image analysis and object detection.
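The "no pre-made schemas" claim in the function-calling feature rests on introspecting Python function signatures at runtime. The following is a simplified, hypothetical sketch of that technique using the standard library's `inspect` module; the `make_schema` helper and `get_weather` function are invented for illustration and are not llm-axe's implementation.

```python
import inspect

# Simplified sketch of automatic schema generation from a function
# signature: the general technique behind "function calling with minimal
# code". Illustrative only; not llm-axe's actual implementation.

_PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}


def make_schema(func):
    """Build a JSON-schema-like description from a function's signature."""
    sig = inspect.signature(func)
    params = {}
    for name, param in sig.parameters.items():
        # Map the Python type annotation to a JSON schema type,
        # defaulting to "string" when no annotation is present.
        json_type = _PY_TO_JSON.get(param.annotation, "string")
        params[name] = {"type": json_type}
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": params,
    }


def get_weather(city: str, days: int):
    """Get a weather forecast for a city."""
    return f"Forecast for {city} over {days} days"


print(make_schema(get_weather))
```

A description like this can be serialized into the prompt so the model picks a function and arguments, which is why the caller never has to hand-write a schema or a specialized prompt.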

Maintenance & Community

  • Community support is available via Discord.

Licensing & Compatibility

  • The license is not explicitly stated in the provided README. Compatibility for commercial use or closed-source linking is therefore undetermined.

Limitations & Caveats

Result quality depends heavily on the capabilities of the underlying LLM: the project was tested with llama3 8B instruct (4-bit quantization), and performance with other models may vary. The license status also requires clarification before commercial adoption.

Health Check

  • Last commit: 6 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 26 stars in the last 90 days
