Swift SDK for local LLM interaction on Apple platforms
LLM.swift is a Swift library designed for easy local interaction with large language models across all Apple platforms. It targets developers building AI-powered applications on macOS, iOS, watchOS, tvOS, and visionOS, offering a simplified interface to models compatible with llama.cpp.
How It Works
LLM.swift acts as a lightweight abstraction layer over the llama.cpp project. This design choice ensures high performance and compatibility with any model supported by llama.cpp. It provides a flexible `LLM` class that can be subclassed, allowing customization of input preprocessing, output postprocessing, and real-time output updates via closures. Models can be loaded directly from bundled GGUF files or downloaded from Hugging Face.
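As a rough orientation, subclassing might look like the sketch below. The model filename, system prompt, and template choice are placeholder assumptions, and the initializer and `respond(to:)`/`output` usage follow the project's README at the time of writing, so exact signatures may differ in the current release:

```swift
import Foundation
import LLM

// Illustrative subclass that loads a GGUF model bundled with the app.
// "model.gguf" and the chatML system prompt are placeholder assumptions.
class Bot: LLM {
    convenience init?() {
        guard let url = Bundle.main.url(forResource: "model", withExtension: "gguf") else { return nil }
        self.init(from: url, template: .chatML("You are a helpful assistant."))
    }
}

// Ask a question; per the README, the generated text accumulates in `output`.
func ask() async {
    guard let bot = Bot() else { return }
    await bot.respond(to: "What is the meaning of life?")
    print(bot.output)
}
```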
Quick Start & Requirements
Add LLM.swift to your project via Swift Package Manager:

.package(url: "https://github.com/eastriverlee/LLM.swift/", branch: "main")
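In a full Package.swift manifest, that dependency declaration might be wired up as in the following sketch; the package name, platform versions, and the product/package identifiers are assumptions rather than confirmed values:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp", // placeholder package name
    platforms: [.macOS(.v13), .iOS(.v16)], // illustrative minimum platforms
    dependencies: [
        .package(url: "https://github.com/eastriverlee/LLM.swift/", branch: "main")
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            dependencies: [.product(name: "LLM", package: "LLM.swift")] // assumed product name
        )
    ]
)
```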
A model in GGUF format is required for llama.cpp compatibility.

Highlighted Details
- `preprocess`, `postprocess`, and `update` callbacks for fine-grained control (see the sketch after this list).
- Model support spans the full range of llama.cpp capabilities.
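A minimal sketch of how those callbacks might be attached is shown below. The property names come from the project's documentation, but the closure signatures (including the `[Chat]` history parameter and the optional `String?` delta) are assumptions:

```swift
import LLM

// Attach the customization closures to an existing instance.
func configure(_ bot: LLM) {
    bot.preprocess = { input, history in
        // Shape the prompt before generation; `history` is assumed to hold prior chat turns.
        "You have \(history.count) previous turns of context.\n\(input)"
    }
    bot.update = { delta in
        // Stream each output fragment as it is produced; nil is assumed to mark completion.
        if let delta { print(delta, terminator: "") }
    }
    bot.postprocess = { output in
        // Run once generation finishes, e.g. to log or persist the final answer.
        print("\n[\(output.count) characters generated]")
    }
}
```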
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README notes that larger models (e.g., 7B parameters) may require significant memory and computational resources, especially on mobile devices, and recommends smaller models (3B parameters or less) for better performance. Model compatibility depends on llama.cpp support.
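For example, a small quantized model can be pulled from Hugging Face instead of bundling a large file. The `HuggingFaceModel` initializer, repository id, and `.Q2_K` quantization level in this sketch are illustrative assumptions based on the README's download example:

```swift
import LLM

// Fetch a small quantized model to keep memory usage modest on mobile devices.
func makeSmallBot() async -> LLM? {
    let model = HuggingFaceModel("TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF", .Q2_K,
                                 template: .chatML("You are a helpful assistant."))
    return try? await LLM(from: model)
}
```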