Native client for interacting with LLM services
Heat is a native iOS and macOS client designed to provide a unified interface for interacting with various Large Language Models (LLMs), both cloud-hosted and locally run. It aims to simplify access to popular LLM services and open-source models for users who want to leverage AI capabilities across their Apple devices.
How It Works
Heat utilizes a companion library, Swift GenKit, to abstract the complexities and differences between various LLM providers, including OpenAI, Mistral, Perplexity, Anthropic, and local Ollama deployments. This abstraction allows Heat to offer a consistent user experience regardless of the underlying model or service, supporting features like multi-step tool use, web search integration, and calendar/filesystem access.
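The provider abstraction described above can be sketched as a common protocol that every backend conforms to. This is an illustrative Swift sketch, not GenKit's actual API; all type and method names here (`ChatMessage`, `ChatService`, `OllamaService`) are hypothetical.

```swift
import Foundation

// Hypothetical message type shared across providers.
struct ChatMessage {
    let role: String      // e.g. "user", "assistant", "tool"
    let content: String
}

// A single protocol the app codes against, regardless of backend.
protocol ChatService {
    func complete(messages: [ChatMessage]) async throws -> ChatMessage
}

// Each provider (OpenAI, Mistral, Anthropic, a local Ollama server, ...)
// supplies its own conformance; the UI layer never branches on the backend.
struct OllamaService: ChatService {
    let host: URL

    func complete(messages: [ChatMessage]) async throws -> ChatMessage {
        // In a real implementation this would POST the conversation to the
        // local Ollama HTTP endpoint and decode the streamed response.
        return ChatMessage(role: "assistant", content: "stub reply")
    }
}
```

With this shape, switching models or services is a matter of swapping which `ChatService` instance the app holds, which is what lets a client like Heat present one consistent experience across providers.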
Quick Start & Requirements
Highlighted Details
Maintenance & Community
The project is maintained by nathanborror. The README does not list additional community channels.
Licensing & Compatibility
The README does not specify a license. Compatibility for commercial use or closed-source linking is not detailed.
Limitations & Caveats
The project is currently limited to Apple's ecosystem (iOS and macOS). The original goal of on-device model execution has been deferred due to technical challenges.
Last activity: 2 months ago (project appears inactive).