prompt-optimizer by linshenkx

Prompt optimizer for crafting high-quality prompts

Created 11 months ago
18,517 stars

Top 2.5% on SourcePulse

Project Summary

Prompt Optimizer is a web application and Chrome extension designed to help users craft higher-quality AI prompts and thereby improve the quality of AI output. It targets AI users, developers, and researchers who want to get better results from large language models.

How It Works

The tool functions as a client-side application, processing data directly in the user's browser. It supports multiple AI providers, including OpenAI, Gemini, and DeepSeek, and allows direct comparison of original and optimized prompts. A key architectural choice is its "secure architecture": all data, including API keys, stays client-side, and requests go directly to the AI service providers without passing through intermediate servers, for better privacy and security.
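
As a rough illustration of this direct, client-side flow (a sketch only, not the project's code; the endpoint, model name, and payload shape are assumptions based on the standard OpenAI-compatible chat API):

    // Sketch: the browser calls the AI provider directly with a locally held key,
    // so neither the prompt nor the key passes through an intermediate server.
    async function optimizePrompt(rawPrompt: string, apiKey: string): Promise<string> {
      const res = await fetch("https://api.openai.com/v1/chat/completions", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${apiKey}`, // key stays in the client
        },
        body: JSON.stringify({
          model: "gpt-4o-mini", // model choice is an assumption
          messages: [
            { role: "system", content: "Rewrite the user's prompt to be clearer and more specific." },
            { role: "user", content: rawPrompt },
          ],
        }),
      });
      const data = await res.json();
      return data.choices[0].message.content;
    }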

Quick Start & Requirements

  • Online Version: Access directly at https://prompt.always200.com.
  • Vercel Deployment: One-click deployment via Vercel, or fork the repository and deploy it yourself. See the Vercel deployment guide.
  • Chrome Plugin: Install from the Chrome Web Store: Prompt Optimizer.
  • Docker: docker run -d -p 80:80 --restart unless-stopped --name prompt-optimizer linshen/prompt-optimizer. API keys can be passed via environment variables (e.g., -e VITE_OPENAI_API_KEY=your_key; see the sketch after this list). Docker Compose instructions are also available.
  • Local Development: Requires pnpm for dependency management (pnpm install, pnpm dev).
  • Prerequisites: API keys for supported AI models (OpenAI, Gemini, DeepSeek, custom APIs).
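
For context on the VITE_-prefixed variable names above: in a Vite-based client, such variables are conventionally exposed to browser code through import.meta.env. The sketch below shows only that general pattern; whether the project injects these values at build time or at container start, and the fallback to a locally stored setting, are assumptions rather than documented behavior.

    // Sketch (assumption, not the project's actual code): reading a VITE_-prefixed
    // variable in a Vite client, with a fallback to a hypothetical locally stored key.
    interface ProviderConfig {
      apiKey?: string;
    }

    function loadOpenAIConfig(): ProviderConfig {
      // import.meta.env is Vite's standard mechanism for exposing VITE_* variables.
      const fromEnv = import.meta.env.VITE_OPENAI_API_KEY as string | undefined;
      const fromSettings = localStorage.getItem("openai_api_key") ?? undefined; // hypothetical key name
      return { apiKey: fromEnv ?? fromSettings };
    }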

Highlighted Details

  • Supports multi-round prompt iteration and optimization.
  • Features real-time comparison between original and optimized prompts.
  • Offers client-side storage for history and API keys with local encryption (a minimal sketch follows this list).
  • Vercel deployment supports an Edge Runtime proxy to mitigate cross-origin issues.
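
On the client-side storage point above, the sketch below shows one plausible way to keep API keys encrypted in the browser, using AES-GCM from the standard Web Crypto API. The project's actual encryption scheme is not documented here, so treat this purely as an assumption.

    // Sketch: encrypting an API key before persisting it locally (AES-GCM via Web Crypto).
    async function storeApiKey(name: string, apiKey: string, cryptoKey: CryptoKey): Promise<void> {
      const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per write
      const ciphertext = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv },
        cryptoKey,
        new TextEncoder().encode(apiKey),
      );
      // Persist nonce + ciphertext locally; nothing leaves the browser.
      localStorage.setItem(name, JSON.stringify({
        iv: Array.from(iv),
        data: Array.from(new Uint8Array(ciphertext)),
      }));
    }

    async function loadApiKey(name: string, cryptoKey: CryptoKey): Promise<string | null> {
      const stored = localStorage.getItem(name);
      if (!stored) return null;
      const { iv, data } = JSON.parse(stored);
      const plaintext = await crypto.subtle.decrypt(
        { name: "AES-GCM", iv: new Uint8Array(iv) },
        cryptoKey,
        new Uint8Array(data),
      );
      return new TextDecoder().decode(plaintext);
    }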

Maintenance & Community

The project is actively maintained, with a roadmap indicating ongoing development. Community discussion can be found via Issues and Pull Requests on GitHub.

Licensing & Compatibility

The project is licensed under the MIT license, permitting commercial use and integration with closed-source projects.

Limitations & Caveats

Using the Vercel proxy feature may trigger risk control mechanisms from some AI service providers, potentially leading to service restrictions. For commercial APIs with strict cross-origin limitations, self-hosting an intermediary API service is recommended.
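
A minimal sketch of such a self-hosted intermediary, assuming a Node.js 18+ environment (which ships a global fetch); the port, route handling, and upstream URL are placeholders, not anything prescribed by the project.

    // Sketch: a tiny pass-through service that forwards requests to the provider
    // server-side, so the browser never hits the provider's cross-origin limits.
    import { createServer } from "node:http";

    const UPSTREAM = "https://api.openai.com/v1/chat/completions"; // placeholder target

    createServer(async (req, res) => {
      if (req.method === "OPTIONS") {
        // Answer CORS preflight requests from the web client.
        res.writeHead(204, {
          "Access-Control-Allow-Origin": "*",
          "Access-Control-Allow-Headers": "Content-Type, Authorization",
          "Access-Control-Allow-Methods": "POST, OPTIONS",
        });
        res.end();
        return;
      }
      if (req.method !== "POST") {
        res.writeHead(405);
        res.end();
        return;
      }
      const chunks: Buffer[] = [];
      for await (const chunk of req) chunks.push(chunk as Buffer);
      const upstream = await fetch(UPSTREAM, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: req.headers.authorization ?? "",
        },
        body: Buffer.concat(chunks),
      });
      res.writeHead(upstream.status, {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*", // relax CORS for the web client
      });
      res.end(await upstream.text());
    }).listen(3001);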

Health Check

  • Last Commit: 1 week ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 4
  • Issues (30d): 7

Star History

861 stars in the last 30 days

Explore Similar Projects

Starred by Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems") and Travis Fischer (founder of Agentic).

latitude-llm by latitude-dev

  • Open-source platform for AI prompt engineering
  • Top 0.4% on SourcePulse; 4k stars
  • Created 1 year ago; updated 14 hours ago