LLM function calling framework for building scalable AI Agents
Top 24.7% on sourcepulse
YoMo is an open-source framework for building scalable, ultra-fast AI agents with LLM function calling capabilities. It targets developers building AI-powered applications that require low-latency, secure, and globally distributed inference, aiming to enhance customer experiences through responsive AI interactions.
How It Works
YoMo leverages a geo-distributed architecture and the QUIC protocol for low-latency communication between AI agents and its core server. It emphasizes type-safe function calling for Go and TypeScript, enabling robust agent development with improved error detection and IDE support. The framework simplifies serverless DevOps for LLM tools, allowing developers to focus on agent functionality rather than infrastructure management.
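As a rough illustration of the type-safe calling model, a Go function in YoMo pairs a parameter struct with a handler that reads the LLM's arguments and writes back a result. The sketch below follows that shape; the helper names (serverless.Context, ReadLLMArguments, WriteLLMResult) and the struct tags are assumptions about the Go SDK rather than a verified API reference, and the program entry point is assumed to be supplied by the YoMo CLI.

package main

import "github.com/yomorun/yomo/serverless"

// Parameter describes the tool's input; the struct tags are assumed to
// drive the JSON schema shown to the LLM (names and tags are illustrative).
type Parameter struct {
	City string `json:"city" jsonschema:"description=The city to query"`
}

// Description is surfaced to the LLM as the tool's description.
func Description() string {
	return "Get the current weather for a city"
}

// InputSchema tells the framework which struct to derive the schema from.
func InputSchema() any {
	return &Parameter{}
}

// Handler runs when the LLM decides to call this tool.
// The entry point (main) is assumed to be generated by the YoMo CLI.
func Handler(ctx serverless.Context) {
	var p Parameter
	ctx.ReadLLMArguments(&p) // assumed helper: decode the LLM's call arguments
	// Real code would query a weather API here; return a canned answer instead.
	ctx.WriteLLMResult("It is sunny in " + p.City) // assumed helper: return the result to the LLM
}

Defining the schema as a plain Go struct is what makes the "type-safe" claim concrete: mismatched parameters show up as compile-time or IDE errors rather than runtime failures in the agent.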
Quick Start & Requirements
# Install the YoMo CLI
curl -fsSL https://get.yomo.run | sh
# Start the YoMo server with an agent configuration file
yomo serve -c my-agent.yaml
# Build and run a serverless LLM function under the name get-weather
yomo run -n get-weather
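With the server and the get-weather function running, the agent is typically driven through the server's OpenAI-compatible chat endpoint. The Go client below is only a sketch: the host, port, path (localhost:9000, /v1/chat/completions), and model name are assumptions to adapt to your own configuration, not documented defaults.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest mirrors a minimal OpenAI-style chat completion payload.
type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

func main() {
	// Assumed endpoint: adjust host, port, and path to match your config.
	const endpoint = "http://localhost:9000/v1/chat/completions"

	body, _ := json.Marshal(chatRequest{
		Model: "gpt-4o", // model name is illustrative
		Messages: []chatMessage{
			{Role: "user", Content: "What's the weather in Paris?"},
		},
	})

	resp, err := http.Post(endpoint, "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	// Print the raw response; a real client would decode the choices field.
	var out map[string]any
	json.NewDecoder(resp.Body).Decode(&out)
	fmt.Printf("%v\n", out)
}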
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The framework is primarily focused on LLM function calling and geo-distributed inference; broader AI model support or advanced orchestration features may require custom integration.