SpeziLLM by StanfordSpezi

LLM integration for Swift applications

Created 2 years ago
261 stars

Top 97.4% on SourcePulse

Project Summary

SpeziLLM provides Swift modules for integrating LLM functionality into applications, supporting local on-device execution, OpenAI's remote APIs, and LLMs running on local network "Fog" nodes. It targets developers building LLM-powered applications within the Spezi ecosystem, offering a unified interface for diverse LLM backends.

How It Works

SpeziLLM acts as a central orchestrator, abstracting the complexities of different LLM platforms. It leverages an LLMRunner that can be configured with specific platform implementations (LLMLocalPlatform, LLMOpenAIPlatform, LLMFogPlatform). This lets developers switch between or combine LLM sources using a consistent Swift API, promoting code reuse and simplifying integration.
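As a rough sketch of that orchestration, the LLMRunner is registered once in the app's Spezi configuration with whichever platforms the app should support (initializer parameters are omitted here and vary by platform; consult the DocC documentation for the exact setup):

```swift
import Spezi
import SpeziLLM
import SpeziLLMLocal
import SpeziLLMOpenAI

// Sketch: register the LLMRunner with the desired platform implementations.
// Platform initializer arguments are illustrative assumptions; see the
// SpeziLLM DocC docs for the real configuration options.
class LLMAppDelegate: SpeziAppDelegate {
    override var configuration: Configuration {
        Configuration {
            LLMRunner {
                // Local on-device execution (mlx-swift backend)
                LLMLocalPlatform()
                // Remote execution against OpenAI's APIs
                LLMOpenAIPlatform()
            }
        }
    }
}
```

Because all platforms sit behind the same LLMRunner, swapping a local model for a remote one is a configuration change rather than a code rewrite.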

Quick Start & Requirements

  • Installation: Add SpeziLLM as a Swift Package dependency in Xcode.
  • Prerequisites: Requires the Spezi core infrastructure. Local execution (SpeziLLMLocal) requires a device with a modern Metal GPU family (MTLGPUFamily), is not compatible with simulators, and may need the "Increase Memory Limit" entitlement. Fog execution (SpeziLLMFog) requires a SpeziLLMFogNode running on the local network and specific Info.plist entries for local network discovery.
  • Setup: Configure the LLMRunner in your SpeziAppDelegate with the desired platform. Refer to DocC documentation for detailed target setup.
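Once the runner is configured, views consume it from the environment to create an inference session and stream tokens. The following is a hedged sketch of that pattern; the schema, session, and parameter names follow the project's documented style but should be verified against the current DocC documentation:

```swift
import SpeziLLM
import SpeziLLMOpenAI
import SwiftUI

struct LLMChatDemoView: View {
    // The LLMRunner configured in the SpeziAppDelegate is injected
    // into the SwiftUI environment by Spezi.
    @Environment(LLMRunner.self) var runner

    @State private var responseText = ""

    var body: some View {
        Text(responseText)
            .task {
                do {
                    // Obtain a session for a concrete schema (model configuration).
                    // Type and parameter names are assumptions based on the docs.
                    let session: LLMOpenAISession = runner(
                        with: LLMOpenAISchema(parameters: .init(modelType: .gpt4o))
                    )
                    // Stream generated tokens as they arrive.
                    for try await token in try await session.generate() {
                        responseText.append(token)
                    }
                } catch {
                    responseText = "Generation failed: \(error)"
                }
            }
    }
}
```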

Highlighted Details

  • Local Execution: Supports Hugging Face models like Llama3, Phi, Gemma, and DeepSeek-R1 via mlx-swift, with an optional download manager and onboarding view.
  • OpenAI Integration: Provides a Swift API for OpenAI's GPT models, including support for function calling.
  • Fog Computing: Enables LLM inference on distributed local network resources, discovered via mDNS.
  • Swift-Native: Designed for seamless integration into Swift and SwiftUI applications.
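For the function-calling support mentioned above, SpeziLLMOpenAI documents a declarative DSL in which tools are Swift types. A minimal sketch (the protocol and property-wrapper names follow the project's documented pattern, but the weather example itself is hypothetical):

```swift
import SpeziLLMOpenAI

// Sketch of a tool the LLM may invoke via OpenAI function calling.
// LLMFunction and @Parameter follow SpeziLLMOpenAI's documented DSL;
// verify the exact signatures against the current DocC documentation.
struct WeatherFunction: LLMFunction {
    static let name = "get_weather"
    static let description = "Returns the current weather for a location."

    @Parameter(description: "The city to look up.")
    var location: String

    func execute() async throws -> String? {
        // A real app would call a weather service here.
        "Sunny, 22 °C in \(location)"
    }
}
```

The function is then registered when building the OpenAI schema, so the model can decide at inference time whether to call it.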

Maintenance & Community

  • The project is developed by Stanford University.
  • Contributions are welcome following their guidelines.
  • Links to contribution guidelines and code of conduct are available.

Licensing & Compatibility

  • Licensed under the MIT License, permitting commercial use and integration with closed-source applications.

Limitations & Caveats

  • SpeziLLMLocal is not compatible with simulators due to Metal GPU requirements.
  • SpeziLLMFog requires a separate SpeziLLMFogNode setup and user authorization for local network access.
Health Check

  • Last Commit: 2 days ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 3
  • Issues (30d): 2

Star History

  • 12 stars in the last 30 days
