ai-dynamo/aiperf: Benchmark generative AI model performance across diverse inference solutions
Summary
AIPerf is a comprehensive benchmarking tool for measuring generative AI model performance across various inference solutions. It provides detailed metrics via a command-line display and extensive reports, targeting engineers and researchers needing to evaluate and optimize AI model deployments.
How It Works
It utilizes a scalable multiprocess architecture with services communicating via ZMQ. AIPerf supports diverse benchmarking modes (concurrency, request-rate, trace replay) and offers three UI options: a real-time TUI dashboard, simple progress bars, or headless execution. A key advantage is its extensibility through a plugin system for custom endpoints, datasets, transports, and metrics.
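As a rough illustration, the three benchmarking modes might map onto command lines like the ones below. The aiperf profile subcommand and every flag shown are assumptions modeled on comparable benchmarking CLIs, not options confirmed by this summary; check aiperf --help for the actual interface.

    # Hypothetical invocations; subcommand and flags are assumptions, not
    # confirmed AIPerf options.
    aiperf profile --model my-model --url http://localhost:8000 --concurrency 32   # fixed concurrency
    aiperf profile --model my-model --url http://localhost:8000 --request-rate 10  # requests per second
    aiperf profile --model my-model --input-file trace.jsonl                       # trace replay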
Quick Start & Requirements
Install with pip install aiperf inside a Python 3 virtual environment.
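A minimal setup sketch for a POSIX shell (the aiperf --help check assumes a conventional help flag):

    python3 -m venv .venv        # create an isolated Python 3 environment
    source .venv/bin/activate    # activate it
    pip install aiperf           # install AIPerf from PyPI
    aiperf --help                # confirm the CLI is on PATH (assumed flag)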
Maintenance & Community
A CONTRIBUTING.md file is provided for development setup and contribution guidelines. No specific community links (Discord, Slack), maintainer information, sponsorships, or roadmap details are present in the README.
Licensing & Compatibility
The README does not explicitly state the project's license, so the terms should be clarified before assuming the project is suitable for commercial use or for linking into closed-source software.
Limitations & Caveats
Output sequence length constraints may not be guaranteed without specific inference server support. High concurrency settings (>15,000) might cause port exhaustion. Startup errors from invalid configurations can lead to indefinite hangs. Dashboard UI text copying may be unreliable; use the 'c' key for full log copy.
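If high-concurrency runs do exhaust ephemeral ports, one mitigation on Linux is to widen the kernel's ephemeral port range. This is a sketch using standard Linux sysctls, not AIPerf settings; changing the value requires root:

    # Inspect the current ephemeral port range (Linux).
    sysctl net.ipv4.ip_local_port_range
    # Temporarily widen it; add to /etc/sysctl.conf to persist across reboots.
    sudo sysctl -w net.ipv4.ip_local_port_range="1024 65535"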