iOS/macOS app for local LLM inference
Top 24.1% on sourcepulse
LLMFarm is an iOS and macOS application enabling offline execution of various large language models (LLMs) and multimodal models. It targets developers and power users on Apple platforms seeking to test and deploy LLMs locally, leveraging the GGML library for efficient inference.
How It Works
LLMFarm utilizes the GGML library, a C library for machine learning, to run LLMs efficiently on Apple hardware. It supports Metal for GPU acceleration on Apple Silicon Macs, offering faster inference. The application provides a flexible interface for loading diverse model architectures and configuring various sampling methods for fine-tuned output generation.
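The "various sampling methods" mentioned above (temperature, top-k, top-p, and so on) follow a standard scheme: scale the model's logits, restrict the candidate set, then draw from the resulting distribution. The sketch below is a generic Python illustration of temperature plus top-k sampling, not LLMFarm's actual Swift/GGML implementation; the function name and parameters are illustrative.

```python
import math
import random

def sample_token(logits, temperature=0.8, top_k=40, rng=None):
    """Pick a token index from raw logits using temperature + top-k sampling.

    This is a generic sketch of the technique, not LLMFarm's API.
    """
    rng = rng or random.Random()
    # Keep only the top_k highest-scoring candidates.
    ranked = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:top_k]
    # Temperature-scaled softmax over the surviving candidates
    # (lower temperature sharpens the distribution).
    scaled = [logits[i] / temperature for i in ranked]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one candidate in proportion to its probability.
    r = rng.random()
    acc = 0.0
    for idx, p in zip(ranked, probs):
        acc += p
        if r < acc:
            return idx
    return ranked[-1]

# Example: with top_k=2 only the two strongest candidates (indices 3 and 0)
# can ever be chosen.
token = sample_token([2.0, 0.5, -1.0, 3.0], temperature=0.7, top_k=2)
```

Lowering the temperature or top-k makes output more deterministic; raising them increases diversity, which is the trade-off these settings expose in the app.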
Quick Start & Requirements
git clone --recurse-submodules https://github.com/guinmoon/LLMFarm
Clone with --recurse-submodules so the vendored dependencies are fetched as well; if you have already cloned without it, run `git submodule update --init --recursive`. Building the app requires Xcode on macOS.
Highlighted Details
Maintenance & Community
The core inference library (llmfarm_core) has been moved to a separate repository. LLMFarm builds on rwkv.cpp, Mia, LlamaChat, swift-markdown-ui, and similarity-search-kit.
Licensing & Compatibility
Limitations & Caveats
Last commit: 4 months ago; development is currently inactive.