Local AI assistant for offline LLM use
Top 0.9% on sourcepulse
Jan is an open-source, offline AI assistant designed as a privacy-focused alternative to services like ChatGPT. It targets users who want local control over their AI interactions, offering a user-friendly interface for downloading and running various Large Language Models (LLMs) on personal hardware.
How It Works
Jan is powered by Cortex.cpp, an embeddable C++ AI engine that acts as a backend. Cortex.cpp supports multiple inference engines, including llama.cpp (default), ONNX, and TensorRT-LLM, providing flexibility across different hardware architectures. This multi-engine approach aims to optimize performance and compatibility for a wide range of devices, from standard PCs to multi-GPU setups.
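In practice, applications talk to this local backend the same way they would talk to a hosted service: Jan exposes a local, OpenAI-compatible API server, so requests stay on the machine. The sketch below is a minimal illustration under assumptions: the base URL (http://localhost:1337/v1 is a commonly cited default, but yours may differ) and the model name are placeholders to adjust for your installation.

```typescript
// Minimal sketch: chatting with a locally running Jan/Cortex.cpp backend through
// an OpenAI-compatible endpoint. BASE_URL and MODEL are assumptions -- adjust
// them to match your local Jan installation and downloaded model.
const BASE_URL = "http://localhost:1337/v1";  // assumed default local API address
const MODEL = "llama3.2-1b-instruct";         // hypothetical locally downloaded model

interface ChatResponse {
  choices: { message: { content: string } }[];
}

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  if (!res.ok) {
    throw new Error(`Local server returned ${res.status}: ${await res.text()}`);
  }
  const data = (await res.json()) as ChatResponse;
  // OpenAI-compatible servers return the reply in choices[0].message.content.
  return data.choices[0].message.content;
}

// Everything above talks only to localhost, so prompts never leave the machine.
chat("In one sentence, what does an inference engine do?")
  .then(console.log)
  .catch(console.error);
```

Because the endpoint mirrors the OpenAI API shape, existing client code can usually be pointed at the local server by swapping the base URL, regardless of which inference engine Cortex.cpp is using underneath.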
Quick Start & Requirements
Run make dev for development.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The project explicitly states that Jan is still in development and warns of potential breaking changes and bugs. The AGPLv3 license may also impose significant restrictions on integrating Jan into closed-source commercial products.