OpenClawProBench  by suyoumo

High-performance LLM inference API combining diverse model strengths

Created 1 year ago
288 stars

Top 91.3% on SourcePulse

View on GitHub
Project Summary

DeepClaude Pro

This project provides a high-performance, OpenAI-compatible LLM API backend that merges DeepSeek R1's reasoning with Claude's creativity and code generation. It targets developers and researchers who want to combine the strengths of both models while retaining full control over their API keys and data, with effectiveness improvements validated through integrated benchmarking.

How It Works

A Rust-based API integrates DeepSeek R1's metacognitive reasoning (chain-of-thought, self-correction) with Anthropic's Claude models for code generation and creativity. It exposes an OpenAI-compatible interface and supports two distinct operational modes: "full" (programming-optimized) and "normal" (thought-focused). This dual-model approach aims for synergistic performance, validated by integrated benchmarking.
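Because the backend is OpenAI-compatible, any standard chat-completions client can talk to it. A minimal Python sketch of what a request payload might look like; note that the endpoint URL, model identifier, and the `mode` field name are assumptions for illustration, not documented values:

```python
import json

# Hypothetical local endpoint for the DeepClaude backend (assumption:
# the actual host/port depend on how the Rust server is configured).
API_URL = "http://127.0.0.1:1337/v1/chat/completions"

def build_chat_request(prompt: str, mode: str = "full") -> dict:
    """Build an OpenAI-style chat-completions payload.

    The "mode" field mirrors the project's full/normal operational
    modes; the exact parameter name is an assumption.
    """
    return {
        "model": "deepclaude",  # assumed model identifier
        "mode": mode,           # "full" (programming) or "normal" (thought)
        "messages": [
            {"role": "user", "content": prompt},
        ],
        "stream": False,
    }

payload = build_chat_request("Write a Rust hello-world program.")
print(json.dumps(payload, indent=2))
```

This payload could then be POSTed to the backend with any HTTP client; the OpenAI-compatible shape is what lets existing tooling work unchanged.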

Quick Start & Requirements

  • Prerequisites: Rust 1.75+, DeepSeek API key, Anthropic API key.
  • Installation: Clone the repository (git clone https://github.com/getasterisk/deepclaude.git), build the Rust backend (cargo build --release), and run the Node.js frontend (cd frontend && npm run dev).
  • Configuration: Set API keys and operational mode (full/normal) via .env file or frontend settings.
  • Access: Frontend available at http://localhost:3000/chat.
  • Links: Online demo: https://deepclaudepro.com/.
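The configuration step above can be sketched as a `.env` file. The variable names below are assumptions for illustration, not documented keys:

```shell
# Hypothetical .env for the DeepClaude backend (variable names are assumed)
DEEPSEEK_API_KEY=sk-...        # DeepSeek R1 API key
ANTHROPIC_API_KEY=sk-ant-...   # Anthropic Claude API key
MODE=full                      # "full" (programming) or "normal" (thought)
```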

Highlighted Details

  • Performance: High-performance Rust API.

Health Check

  • Last commit: 11 months ago
  • Responsiveness: Inactive
  • Pull requests (30d): 0
  • Issues (30d): 0
  • Star history: 80 stars in the last 30 days
