Awesome-Ollama-Server by forrany

Ollama service monitor for availability/performance, with web UI

Created 6 months ago
354 stars

Top 78.8% on SourcePulse

View on GitHub
Project Summary

This project provides a system for monitoring the availability and performance of Ollama services, targeting users who manage or interact with multiple Ollama instances. It offers a modern, multilingual web interface with real-time detection and data visualization, simplifying the management of LLM model deployments.

How It Works

The system is built with Next.js 14, leveraging React Server Components for efficient rendering and TypeScript for type safety. It uses Tailwind CSS for styling and next-intl for its dual-language (Chinese/English) support. The core functionality is real-time detection of Ollama service status, response times, and TPS (tokens per second), with data export and advanced filtering.
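To make the detection step concrete, here is a minimal TypeScript sketch. It assumes the standard Ollama /api/generate endpoint and its eval_count/eval_duration response fields; the function name and the default model are illustrative, and the project's actual implementation may differ.

```typescript
// Minimal sketch of one service check: measure latency and tokens/second.
// eval_count and eval_duration (nanoseconds) are fields from Ollama's
// /api/generate response; the model name below is an assumption — use one
// that is actually installed on the target service.

interface CheckResult {
  url: string;
  available: boolean;
  responseTimeMs: number;
  tps: number; // tokens per second
}

async function checkOllamaService(url: string, model = "llama3"): Promise<CheckResult> {
  const start = Date.now();
  try {
    const res = await fetch(`${url}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt: "ping", stream: false }),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
    // eval_duration is reported in nanoseconds, hence the 1e9 divisor.
    const tps = data.eval_count && data.eval_duration
      ? data.eval_count / (data.eval_duration / 1e9)
      : 0;
    return { url, available: true, responseTimeMs: Date.now() - start, tps };
  } catch {
    return { url, available: false, responseTimeMs: Date.now() - start, tps: 0 };
  }
}
```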

Quick Start & Requirements

  • Install: Clone the repository, then run npm install or yarn install (all commands are consolidated after this list).
  • Prerequisites: Node.js 18.0+ and npm or yarn.
  • Development: Run npm run dev or yarn dev to start the development server.
  • Production: Run npm run build then npm start (or yarn build then yarn start).
  • Docker: docker-compose up -d for full deployment, or docker run for the web app only.
  • Docs: https://github.com/forrany/Awesome-Ollama-Server
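
Put together, the steps above as shell commands (the repository URL matches the docs link; package-manager alternatives are shown in comments):

```bash
# Clone and install dependencies (Node.js 18.0+ required)
git clone https://github.com/forrany/Awesome-Ollama-Server.git
cd Awesome-Ollama-Server
npm install          # or: yarn install

# Start the development server
npm run dev          # or: yarn dev

# Production build and start
npm run build && npm start    # or: yarn build && yarn start

# Full deployment via Docker
docker-compose up -d
```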

Highlighted Details

  • Real-time detection and status display for multiple Ollama services.
  • Performance monitoring including response time and TPS.
  • Multilingual support (Chinese/English) with easy switching.
  • Optional Redis integration for data storage (Upstash); a storage sketch follows this list.
  • FOFA scanning capabilities for discovering Ollama services.
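
As a sketch of how check results might be persisted to Upstash Redis: the @upstash/redis client and its fromEnv()/set() calls are real, but the key layout and function name here are hypothetical, not the project's actual schema.

```typescript
import { Redis } from "@upstash/redis";

// Reads UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN from the
// environment (Upstash's standard variable names).
const redis = Redis.fromEnv();

// Hypothetical key layout: one JSON entry per monitored service URL.
async function saveStatus(
  url: string,
  status: { available: boolean; responseTimeMs: number; tps: number }
): Promise<void> {
  await redis.set(`ollama:status:${url}`, JSON.stringify(status));
}
```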

Maintenance & Community

The project is maintained by VincentKo (@forrany). Contribution guidelines are provided, encouraging forks and pull requests.

Licensing & Compatibility

Licensed under the MIT License, allowing for commercial use and integration with closed-source projects.

Limitations & Caveats

The project is primarily focused on monitoring Ollama services and may not support other LLM serving frameworks. Some advanced features like FOFA scanning and Redis integration are optional and require specific environment variable configuration.
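
The required variable names are not listed in this summary; as an illustration only, the UPSTASH_* names below are Upstash's standard ones, while the FOFA entry is a hypothetical placeholder:

```bash
# .env.local — illustrative values only
UPSTASH_REDIS_REST_URL=https://<your-instance>.upstash.io   # standard Upstash name
UPSTASH_REDIS_REST_TOKEN=<token>                            # standard Upstash name
FOFA_API_KEY=<key>                                          # hypothetical name
```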

Health Check

  • Last Commit: 18 hours ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 6 stars in the last 30 days

Explore Similar Projects

Starred by Tobi Lutke (Cofounder of Shopify), Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; author of CS 231n), and 24 more.

open-webui by open-webui

Self-hosted AI platform for local LLM deployment
Top 0.6% on SourcePulse · 110k stars · Created 1 year ago · Updated 1 day ago