openai-apps-sdk-examples by openai

Examples for building rich ChatGPT applications with the Apps SDK

Created 1 week ago · 1,399 stars · Top 28.9% on SourcePulse

Project Summary

This repository provides example applications for OpenAI's Apps SDK, showcasing UI components and Model Context Protocol (MCP) servers. It serves as a starting point and inspiration for developers building custom applications for ChatGPT, enabling rich, interactive user experiences by connecting LLM clients to external tools and data.

How It Works

The project leverages the Model Context Protocol (MCP), an open specification for integrating large language model clients with external tools, data, and user interfaces. MCP servers expose tools with defined JSON Schema contracts, allowing models to call them. Crucially, MCP servers can return structured content alongside metadata, such as inline HTML, which the Apps SDK interprets to render rich UI components (widgets) directly within the ChatGPT interface, keeping the server, model, and UI synchronized.
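To make the pairing of structured content and widget metadata concrete, here is a minimal sketch of what such a tool result might look like. The field names (content, structuredContent, the _meta key) follow the description above but are illustrative, not copied from the SDK:

```python
import json

# Hypothetical shape of an MCP tool result: plain text and structured data
# for the model, plus _meta pointing the client at the widget template that
# should be hydrated with that data. Field names are illustrative.
tool_result = {
    "content": [
        {"type": "text", "text": "Found 2 pizza spots near you."}
    ],
    "structuredContent": {
        "places": [
            {"name": "Pizzaz Downtown", "rating": 4.6},
            {"name": "Pizzaz Uptown", "rating": 4.2},
        ]
    },
    "_meta": {
        # Tells the client which bundled UI widget renders this result.
        "openai/outputTemplate": "ui://widget/pizza-list.html",
    },
}

print(json.dumps(tool_result, indent=2))
```

Because the same payload carries both the model-readable JSON and the widget reference, the server, model, and UI stay in sync from a single response.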

Quick Start & Requirements

  • Prerequisites: Node.js 18+, pnpm (recommended) or npm/yarn, Python 3.10+ (for Python MCP servers).
  • Install Dependencies: Clone the repository and run pnpm install (or npm install/yarn install).
  • Build Components: Execute pnpm run build to bundle UI components into static assets located in the assets/ directory.
  • Local Development Server: Use pnpm run dev for live development.
  • Serve Static Assets: Run pnpm run serve to host generated bundles at http://localhost:4444 with CORS enabled.
  • Run MCP Servers:
    • Pizzaz (Node.js): cd pizzaz_server_node && pnpm start
    • Pizzaz (Python): Set up a Python virtual environment (python -m venv .venv, source .venv/bin/activate), install requirements (pip install -r pizzaz_server_python/requirements.txt), and run uvicorn pizzaz_server_python.main:app --port 8000.
    • Solar System (Python): Use the same Python setup, then run uvicorn solar-system_server_python.main:app --port 8000.
  • Testing in ChatGPT: Enable developer mode and add local servers via Settings > Connectors, potentially using a tool like ngrok to expose local ports (e.g., ngrok http 8000).

Highlighted Details

  • Demonstrates how MCP servers can combine plain text, JSON, and the _meta["openai/outputTemplate"] metadata key to hydrate matching UI widgets.
  • Supports both Server-Sent Events and streaming HTTP for MCP transport.
  • Provides a clear structure for creating custom components and customizing widget data by editing server handlers.
  • Bundled assets include necessary CSS, allowing direct hosting or shipping with custom servers.
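For the Server-Sent Events transport noted above, each message travels as a text/event-stream frame: an optional event: line, a data: line, and a blank-line terminator. A small sketch of that framing, following the SSE wire format rather than any particular SDK:

```python
import json

def sse_frame(payload: dict, event: str = "message") -> str:
    """Format one JSON payload as a Server-Sent Events frame.

    Per the text/event-stream format, a frame is an `event:` line,
    a `data:` line, and a blank line that terminates the event.
    """
    return f"event: {event}\ndata: {json.dumps(payload)}\n\n"

# Example: framing a JSON-RPC request as one SSE event.
frame = sse_frame({"jsonrpc": "2.0", "method": "tools/list", "id": 1})
print(frame)
```

The streaming-HTTP transport carries the same JSON-RPC messages without this event framing; the choice between the two is a deployment detail, not a change to the tool contract.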

Maintenance & Community

Contributions via issues or PRs are welcome, though the project notes that not every suggestion may be reviewed. The README lists no community channels (e.g., Discord, Slack) or roadmap links.

Licensing & Compatibility

This project is licensed under the MIT License, which is permissive for commercial use and integration into closed-source projects.

Limitations & Caveats

The repository serves as a gallery of examples and a starting point, requiring developer effort to adapt for production use. The contribution review process may not guarantee all submitted improvements will be integrated.

Health Check

  • Last Commit: 6 days ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 15
  • Issues (30d): 34
  • Star History: 1,477 stars in the last 9 days

