apfel by Arthur-Ficial

Command-line access to on-device Apple Intelligence LLMs

Created 1 week ago

2,353 stars

Top 19.0% on SourcePulse

Project Summary

apfel provides command-line access to Apple's on-device Large Language Model (LLM) available on Apple Silicon Macs. It targets developers and power users seeking to leverage local, private AI capabilities without cloud dependencies or API costs. The primary benefit is enabling sophisticated AI interactions directly from the terminal or via an OpenAI-compatible server, enhancing productivity and privacy.

How It Works

The project utilizes Apple's FoundationModels framework (macOS 26+) to interface with the LLM pre-installed on Apple Silicon hardware. apfel acts as a wrapper, exposing this model through a pipe-friendly command-line interface and a local HTTP server. All inference is performed entirely on-device, ensuring data privacy and eliminating network latency or costs associated with cloud-based AI services. Its architecture supports advanced features like tool calling and integrates seamlessly with existing OpenAI SDKs.
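Because the local server mirrors OpenAI's API, any OpenAI-style client can target it. The sketch below assembles (but does not send) a chat-completions request; the `/v1/chat/completions` route is an assumption based on the usual OpenAI-compatible convention, while the port and model name come from this page.

```python
import json
import urllib.request

# Assumed route: apfel's server listens on localhost:11434 (per this page),
# and OpenAI-compatible servers conventionally expose /v1/chat/completions.
URL = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "apple-foundationmodel",  # the single on-device model (see Limitations)
    "messages": [
        {"role": "system", "content": "You are a terse shell assistant."},
        {"role": "user", "content": "Summarize what `ls -la` does."},
    ],
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the apfel server running locally, urllib.request.urlopen(req)
# would return the completion JSON, just as the OpenAI API does.
print(req.full_url)
```

The same shape means the official OpenAI SDKs work unmodified once their base URL is pointed at the local server.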

Quick Start & Requirements

  • Requirements: Apple Silicon Mac, macOS 26 Tahoe or newer, with Apple Intelligence enabled. Building from source requires Command Line Tools with the macOS 26.4 SDK.
  • Installation (Recommended):
    brew tap Arthur-Ficial/tap
    brew install Arthur-Ficial/tap/apfel
    
  • Installation (Source):
    git clone https://github.com/Arthur-Ficial/apfel.git
    cd apfel
    make install
    
  • Documentation: Troubleshooting and installation details are available at docs/install.md.
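Once installed, the CLI can be scripted directly. A minimal sketch, assuming `apfel` is on PATH and accepts the prompt as a positional argument (the flags `-s`, `-o json`, and `-q` are documented below; the positional-prompt form is an assumption):

```python
import subprocess

# Documented flags: -s (system prompt), -o json (JSON output), -q (quiet).
cmd = ["apfel", "-s", "Answer in one line.", "-o", "json", "-q", "What is 2+2?"]

# With apfel installed, this would run the prompt and print the JSON result:
# result = subprocess.run(cmd, capture_output=True, text=True)
# print(result.stdout)
print(" ".join(cmd))
```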

Highlighted Details

  • Command-Line Interface: Offers a versatile CLI supporting piped input, file attachments (-f), JSON output (-o json), system prompts (-s), and quiet mode (-q).
  • OpenAI-Compatible Server: Runs a local server at localhost:11434, acting as a drop-in replacement for OpenAI API endpoints and compatible with official SDKs.
  • Tool Calling: Implements function calling with schema conversion, enabling the LLM to interact with external tools.
  • File Integration: Allows attaching file contents directly to prompts for context-aware processing.
  • apfel-gui: A native macOS SwiftUI application provides a graphical interface for chatting, debugging requests, and managing settings.
  • Demo Utilities: Includes scripts like demo/cmd for natural language to shell command conversion.
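Tool calling follows the OpenAI function-calling shape, which apfel converts for the on-device model. The tool name, schema, and dispatch handler below are illustrative assumptions, not taken from apfel's documentation:

```python
import json

# Hypothetical tool definition in OpenAI function-calling format.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def handle_tool_call(call: dict) -> str:
    """Dispatch a model-issued tool call to a local handler (stubbed)."""
    if call["name"] == "get_weather":
        args = json.loads(call["arguments"])
        return f"Sunny in {args['city']}"  # stub standing in for a real lookup
    raise ValueError(f"unknown tool: {call['name']}")

# The model's response would carry a call in this shape:
result = handle_tool_call({"name": "get_weather", "arguments": '{"city": "Berlin"}'})
print(result)  # Sunny in Berlin
```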

Maintenance & Community

No specific details regarding maintainers, community channels (e.g., Discord, Slack), or project roadmaps are provided in the README. The project appears to be maintained through standard GitHub development practices.

Licensing & Compatibility

  • License: MIT. This license permits commercial use, modification, and distribution with minimal restrictions, requiring only attribution.
  • Compatibility: Strictly limited to macOS running on Apple Silicon hardware with macOS 26 or later.

Limitations & Caveats

The system is restricted to macOS 26+ on Apple Silicon and exposes a single, non-configurable model (apple-foundationmodel). The context window is capped at 4096 tokens (input and output combined), roughly 3,000 English words. Apple's built-in safety guardrails can produce false-positive refusals. On-device inference yields response times typically measured in seconds, and the model supports neither embeddings nor multi-modal (vision) inputs. Certain OpenAI API parameters, such as n>1, are explicitly unsupported.

Health Check

  • Last Commit: 12 hours ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 14
  • Issues (30d): 26
  • Star History: 2,565 stars in the last 13 days
