chr15m/runprompt: Executable prompt files for GenAI workflows
Top 70.9% on SourcePulse
This project provides a minimal, single-file Python script for executing .prompt files, which bundle GenAI prompt templates with metadata. It targets developers and power users seeking a dependency-free tool to streamline prompt engineering workflows, enabling quick iteration and execution of complex prompts across various LLM providers.
How It Works
The core of the project is a lightweight Python script that parses .prompt files. These files utilize Handlebars syntax for templating and include frontmatter for specifying the GenAI model, configuration, and output format. The script supports piping standard input ({{STDIN}}) and extracting structured JSON output based on Picoschema definitions, offering a straightforward way to manage and run prompts.
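The parsing and rendering steps described above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the actual runprompt code: the sample file contents and model name are hypothetical, the frontmatter parser handles only flat key: value pairs rather than full YAML, and the template renderer substitutes plain variables only, not full Handlebars syntax.

```python
import re

def parse_prompt_file(text):
    """Split a .prompt-style file into frontmatter and template body.

    Minimal sketch: only flat `key: value` frontmatter lines are handled.
    """
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not match:
        return {}, text
    meta = {}
    for line in match.group(1).splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta, match.group(2)

def render(template, context):
    """Substitute {{name}} placeholders, Handlebars-style (variables only)."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(context.get(m.group(1), "")), template)

# Hypothetical .prompt file; the model identifier is illustrative.
prompt_file = """---
model: openai/gpt-4o
---
Summarize the following text:
{{STDIN}}"""

meta, body = parse_prompt_file(prompt_file)
print(meta["model"])                           # openai/gpt-4o
print(render(body, {"STDIN": "piped text"}))   # template with stdin filled in
```

In the real script the rendered prompt would then be sent to the provider named in the frontmatter; here we only show the template mechanics.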
Quick Start & Requirements
Download the script with curl -O https://raw.githubusercontent.com/chr15m/runprompt/main/runprompt and make it executable with chmod +x runprompt. Set the API key for your chosen provider as an environment variable (e.g. OPENAI_API_KEY).
Highlighted Details
- Prompt template and metadata bundled in a single .prompt file.
- {{STDIN}} for piped input and structured JSON output via Picoschema.
- .prompt files can be made directly executable using a shebang.
Maintenance & Community
A TODO.md file is referenced for the project roadmap. No specific community channels or contributor details are mentioned in the provided README.
Licensing & Compatibility
The license type is not explicitly stated in the provided README excerpt.
Limitations & Caveats
The current implementation does not support multi-message prompts, conditional logic within templates ({{#if}}), custom helpers, model configuration parameters (like temperature), partials, or nested Picoschema structures.