openai-oauth by EvanZhouDev

Free OpenAI API access via ChatGPT account

Created 1 month ago
328 stars

Top 83.3% on SourcePulse

Project Summary

This project provides a method to access OpenAI's API using existing ChatGPT account credentials, bypassing the need for traditional API keys and associated costs. It targets developers and power users seeking to experiment with or integrate OpenAI models locally, offering an OpenAI-compatible endpoint powered by their personal ChatGPT account's OAuth tokens. The primary benefit is enabling free, local access to OpenAI's language models, subject to the user's ChatGPT plan limitations.

How It Works

The project runs a local proxy server that authenticates with the user's existing OpenAI OAuth tokens, typically read from an auth.json file generated by the official Codex CLI login process. The proxy exposes an OpenAI-compatible API endpoint (e.g., http://127.0.0.1:10531/v1), so applications can interact with OpenAI models as if they were using the standard API. Under the hood it calls OpenAI's internal Codex API endpoint (chatgpt.com/backend-api/codex/responses) and supports key features such as streaming responses and tool calls, repurposing existing user authentication in place of an API key.
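The flow above can be sketched with a plain fetch call against the proxy. The address and /v1 path come from the README; the request payload shape is the standard OpenAI chat-completions format, and the model name is an assumption, since available models depend on the user's ChatGPT plan:

```typescript
// Minimal sketch, assuming the proxy is running on the README's example
// address and accepts standard OpenAI-compatible requests.

const PROXY_BASE = "http://127.0.0.1:10531/v1"; // example address from the README

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an OpenAI-compatible chat-completions request body.
function buildChatRequest(model: string, messages: ChatMessage[], stream = false) {
  return { model, messages, stream };
}

// Send the request to the proxy. No Authorization header is set here,
// on the assumption that the proxy itself authenticates upstream using
// the OAuth tokens from auth.json.
async function ask(prompt: string): Promise<string> {
  const body = buildChatRequest("gpt-5", [{ role: "user", content: prompt }]); // model name is an assumption
  const res = await fetch(`${PROXY_BASE}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Any OpenAI SDK pointed at the same base URL should behave similarly, which is the practical payoff of the proxy exposing a compatible /v1 surface.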

Quick Start & Requirements

  • Primary Install/Run: Use npx openai-oauth to run the CLI proxy. For integration, use createOpenAIOauth from openai-oauth-provider.
  • Prerequisites: Requires Node.js and npm. Users must have an existing ChatGPT account and generate an authentication file by running npx @openai/codex login.
  • Dependencies: The provider package depends on the Vercel AI SDK.
  • Configuration: Options include host binding, port, model allowlisting, upstream base URL, and OAuth token URLs.
  • Documentation: Links to official quick-start or demo pages are not explicitly provided in the README.

Highlighted Details

  • Provides a fully OpenAI-compatible API endpoint (/v1).
  • Eliminates the need for API keys by utilizing user OAuth tokens.
  • Supports streaming responses and tool calls for dynamic interactions.
  • Models are discovered based on the user's account access by default.

Maintenance & Community

This is described as an "unofficial, community-maintained project" and is not affiliated with OpenAI. The README does not provide details on specific contributors, sponsorships, community channels (like Discord/Slack), or a public roadmap.

Licensing & Compatibility

The specific open-source license for this project is not stated in the provided README. Users are warned that it is provided "as is" with no warranties. Commercial use, running as a hosted service, sharing access, or pooling/redistributing tokens are explicitly discouraged. Users are solely responsible for complying with OpenAI's Terms of Service and applicable agreements.

Limitations & Caveats

The login flow is not bundled and requires a separate step (npx @openai/codex login). The CLI proxy is stateless and does not support stateful replay for the /v1/responses endpoint. Only LLMs supported by the user's Codex plan are available. Crucially, the project's licensing is not specified, and its use is restricted to personal, local experimentation on trusted machines due to the sensitive nature of the authentication tokens involved.

Health Check
Last Commit

2 weeks ago

Responsiveness

Inactive

Pull Requests (30d)
8
Issues (30d)
1
Star History
186 stars in the last 30 days
