Ollamac by kevinhermawan

Mac app for local LLM inference

created 1 year ago
1,868 stars

Top 23.7% on sourcepulse

View on GitHub
Project Summary

Ollamac provides a native macOS application for interacting with Ollama, a large language model runner. It offers a user-friendly interface for Mac users to leverage local LLMs, enabling offline AI capabilities with features like syntax highlighting and customizable host settings.

How It Works

Ollamac is a native macOS application built using Swift and SwiftUI. It acts as a client for the Ollama backend, allowing users to select and interact with locally downloaded language models. The app leverages the Ollama API to send prompts and receive responses, displaying them in a chat-like interface with syntax highlighting for code snippets.
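
Under the hood, the Ollama API is plain HTTP. As a rough illustration only (not Ollamac's actual source), the sketch below sends a prompt to Ollama's documented /api/generate endpoint; the default host http://localhost:11434 and the model name llama3 are assumptions for the example:

```swift
import Foundation

// Request/response shapes for Ollama's documented /api/generate endpoint.
// Only the fields used here are modeled; the real response has more.
struct GenerateRequest: Codable {
    let model: String
    let prompt: String
    let stream: Bool
}

struct GenerateResponse: Codable {
    let response: String
}

// Sends one prompt to a local Ollama server and returns the reply.
// The default host and the model name "llama3" are assumptions for
// this example, not values taken from Ollamac's source.
func generate(prompt: String,
              model: String = "llama3",
              host: URL = URL(string: "http://localhost:11434")!) async throws -> String {
    var request = URLRequest(url: host.appendingPathComponent("api/generate"))
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // stream: false asks Ollama for a single JSON object instead of a stream.
    request.httpBody = try JSONEncoder().encode(
        GenerateRequest(model: model, prompt: prompt, stream: false)
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(GenerateResponse.self, from: data).response
}
```

Making the host URL a user setting rather than a constant is essentially what the app's customizable host feature amounts to.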

Quick Start & Requirements

  • Install via Homebrew: brew install --cask ollamac
  • Requires macOS 14.0 Sonoma or later.
  • Ollama must be installed and have at least one model pulled (e.g., ollama pull llama3); see the verification sketch after this list.
  • Alternatively, download the app directly from the GitHub releases page.
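
To confirm the Ollama prerequisite is met, a client can ask the server which models are available locally. Here is a minimal sketch against Ollama's documented GET /api/tags endpoint, again assuming the default host:

```swift
import Foundation

// Minimal shape of the response from Ollama's documented GET /api/tags
// endpoint, which lists locally pulled models.
struct TagsResponse: Codable {
    struct Model: Codable { let name: String }
    let models: [Model]
}

// Returns the names of models pulled on the local Ollama server.
// The default host below is Ollama's standard address; change it if
// your server binds elsewhere.
func listLocalModels(host: URL = URL(string: "http://localhost:11434")!) async throws -> [String] {
    let (data, _) = try await URLSession.shared.data(from: host.appendingPathComponent("api/tags"))
    return try JSONDecoder().decode(TagsResponse.self, from: data).models.map(\.name)
}

// Usage: an empty list means no model has been pulled yet.
// let names = try await listLocalModels()
```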

Highlighted Details

  • Native macOS application for Ollama.
  • Supports all Ollama models.
  • Customizable host settings.
  • Syntax highlighting for responses.
  • Free and open-source.

Maintenance & Community

The project is maintained by Kevin Hermawan. The README credits BoltAI as a notable supporter.

Licensing & Compatibility

Licensed under the Apache License 2.0 with additional restrictions. Compatibility with commercial use or closed-source linking should be reviewed against the specific restrictions in the license file.

Limitations & Caveats

The application is restricted to macOS 14.0 Sonoma or later. The README notes that commercial apps using the "Ollamac" name are not affiliated with the original creator, suggesting potential for confusion or unauthorized derivatives.

Health Check

  • Last commit: 4 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 71 stars in the last 90 days
