Swift library for local Alpaca-LoRA prediction on Apple devices
A Swift library for running Alpaca-LoRA predictions locally on Apple devices, enabling ChatGPT-like applications. It serves as a Swift API wrapper for alpaca.cpp, allowing developers to integrate local LLM inference into their macOS and iOS applications.
How It Works
AlpacaChat builds on the alpaca.cpp project, which is in turn based on llama.cpp. It uses 4-bit quantized GGML model files for efficient local execution. The library provides a straightforward Swift API for loading models and generating text predictions, abstracting away the complexities of the underlying C++ inference engine.
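In practice, the usage pattern described above might look like the sketch below. All names here are assumptions for illustration: the module import, the Chat type, and the loadModel(from:) and predict(_:) methods are not confirmed by the README, and the actual AlpacaChat API may differ.

```swift
import Foundation
import AlpacaChat  // assumed module name

// Hypothetical usage sketch; type and method names are assumed.
let chat = Chat()

// Load a 4-bit quantized GGML model file from disk.
try await chat.loadModel(from: URL(fileURLWithPath: "model.bin"))

// Run inference fully on-device via the wrapped alpaca.cpp engine.
let reply = try await chat.predict("What is an alpaca?")
print(reply)
```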
Quick Start & Requirements
Build the AlpacaChatCLI or AlpacaChatApp targets using Xcode or the swift build command. A quantized GGML model file (e.g. model.bin) must be placed in the application's resources. For iOS apps, an AppID and provisioning profile with extended memory usage entitlements are necessary.
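Locating the bundled model file uses the standard Foundation bundle lookup; a minimal sketch follows, assuming the model ships as model.bin in the target's resources. The commented loadModel(from:) call reuses the assumed method name from the earlier sketch.

```swift
import Foundation

// Find the quantized GGML model bundled with the app.
guard let modelURL = Bundle.main.url(forResource: "model", withExtension: "bin") else {
    fatalError("model.bin is missing from the application's resources")
}

// Hand the URL to the library's model loader (assumed API, see above):
// try await chat.loadModel(from: modelURL)
```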
Maintenance & Community
The project is associated with GitHub user niw. Further community or maintenance details are not explicitly provided in the README.
Licensing & Compatibility
The README does not specify a license. Compatibility for commercial use or closed-source linking is undetermined.
Limitations & Caveats
Building and running the iOS application requires specific Apple developer configuration, including a custom AppID and a provisioning profile with extended memory entitlements. Performance on iOS can degrade significantly unless the app is built in a Release configuration.