Ollama service monitor for availability/performance, with web UI
This project provides a system for monitoring the availability and performance of Ollama services, targeting users who manage or interact with multiple Ollama instances. It offers a modern, multilingual web interface with real-time detection and data visualization, simplifying the management of LLM model deployments.
How It Works
The system is built with Next.js 14, leveraging React Server Components for efficient rendering and TypeScript for type safety. It uses Tailwind CSS for styling and next-intl for its dual-language (Chinese/English) support. The core functionality is real-time detection of Ollama service status, response times, and TPS (tokens per second), with data export and advanced filtering capabilities.
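To make the detection step concrete, here is a minimal TypeScript sketch of such a probe. The endpoint paths and response fields (`/api/tags`, `/api/generate`, `eval_count`, `eval_duration`) follow Ollama's public HTTP API, but the function shape and result type are illustrative assumptions, not the project's actual implementation:

```typescript
// Illustrative probe against Ollama's HTTP API (not the project's own code).
interface ProbeResult {
  reachable: boolean;
  responseMs: number;
  tokensPerSecond?: number;
}

async function probeOllama(baseUrl: string, model = "llama3"): Promise<ProbeResult> {
  const started = Date.now();
  try {
    // Availability check: /api/tags lists installed models and is cheap to call.
    const tags = await fetch(`${baseUrl}/api/tags`);
    if (!tags.ok) return { reachable: false, responseMs: Date.now() - started };

    // Throughput check: a short non-streaming generation. Ollama reports
    // eval_count (generated tokens) and eval_duration (nanoseconds).
    const res = await fetch(`${baseUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt: "Hi", stream: false }),
    });
    const data = await res.json();
    const tps =
      data.eval_count && data.eval_duration
        ? data.eval_count / (data.eval_duration / 1e9) // tokens per second
        : undefined;
    return { reachable: true, responseMs: Date.now() - started, tokensPerSecond: tps };
  } catch {
    return { reachable: false, responseMs: Date.now() - started };
  }
}
```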
Quick Start & Requirements
- Install dependencies with `npm install` or `yarn install`.
- Start the development server with `npm run dev` or `yarn dev`.
- Build for production with `npm run build` followed by `npm start`, or with `yarn build` and `yarn start`.
- For Docker, use `docker-compose up -d` for a full deployment, or `docker run` for the web app only (a compose sketch follows).
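A minimal sketch of what the full deployment might look like as a compose file; the service names, port, and optional Redis block are assumptions, not the project's actual docker-compose.yml:

```yaml
# Illustrative docker-compose.yml sketch (assumed layout, not the shipped file).
services:
  web:
    build: .
    ports:
      - "3000:3000"        # Next.js default port
    environment:
      - NODE_ENV=production
  redis:                   # optional cache backend (see Limitations & Caveats)
    image: redis:7-alpine
```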
Highlighted Details
Maintenance & Community
The project is maintained by VincentKo (@forrany). Contribution guidelines are provided, encouraging forks and pull requests.
Licensing & Compatibility
Licensed under the MIT License, allowing for commercial use and integration with closed-source projects.
Limitations & Caveats
The project is primarily focused on monitoring Ollama services and may not support other LLM serving frameworks. Some advanced features like FOFA scanning and Redis integration are optional and require specific environment variable configuration.
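Purely as an illustration, enabling those optional features might involve .env entries along these lines; the variable names below are hypothetical and should be checked against the project's documentation:

```bash
# Hypothetical variable names -- confirm against the project docs.
REDIS_URL=redis://localhost:6379   # assumed name for the Redis connection string
FOFA_EMAIL=you@example.com         # assumed names for FOFA API credentials
FOFA_KEY=your-fofa-api-key
```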