prompt-engine by Microsoft

NPM library for LLM prompt engineering

created 3 years ago
2,693 stars

Top 18.0% on sourcepulse

View on GitHub
Project Summary

This NPM library helps developers construct and manage prompts for Large Language Models (LLMs), simplifying complex prompt engineering tasks and codifying best practices for natural-language-to-code and chat scenarios. It targets developers working with LLMs such as GPT-3 and Codex, enabling more effective interaction with these models through structured prompt creation and context management.

How It Works

The library offers generic PromptEngine, CodeEngine, and ChatEngine classes. These engines compose prompts from a task description, input/output examples, and a dialog history. Because LLMs are stateless, the engine carries conversational context forward by replaying that dialog history in each prompt, which is crucial for multi-turn interactions. The CodeEngine is tailored for natural-language-to-code tasks and can be customized for different programming languages, while the ChatEngine manages conversational flow for dialogue-based LLMs.
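
A rough sketch of how these pieces fit together, assuming the constructor shape (a description plus an array of input/response examples) and the buildPrompt function listed later in this summary; check the examples folder for the exact signatures:

    import { CodeEngine } from "prompt-engine";

    // A task description plus a few input/output examples seed the prompt.
    const description = "Natural language commands to JavaScript math code";
    const examples = [
      { input: "what's 10 plus 18", response: "console.log(10 + 18)" },
      { input: "what's 10 times 18", response: "console.log(10 * 18)" },
    ];

    const codeEngine = new CodeEngine(description, examples);

    // buildPrompt composes the description, examples, and any accumulated
    // dialog history into a single prompt string for the target model.
    const prompt = codeEngine.buildPrompt("what's 1018 times the ninth power of four?");
    console.log(prompt);

The engine only builds the prompt string; sending it to a model and handling the response remain the caller's responsibility.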

Quick Start & Requirements

  • Install: npm install prompt-engine
  • Requirements: Node.js environment.
  • Documentation: usage examples in the repository's examples folder (see the smoke test below)
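
As a quick post-install smoke test, the sketch below exercises a ChatEngine end to end; it assumes the constructor accepts a description on its own and that addInteraction and buildPrompt behave as described in this summary:

    import { ChatEngine } from "prompt-engine";

    // A persona description for the conversation (leaving out the examples
    // array is an assumption; pass input/response pairs if required).
    const chatEngine = new ChatEngine("A friendly assistant that answers astronomy questions");

    // Build the prompt for the first user turn and send it to your LLM of choice.
    const firstPrompt = chatEngine.buildPrompt("How far away is the Moon?");

    // Record the exchange so the next prompt carries the conversational context.
    chatEngine.addInteraction("How far away is the Moon?", "About 384,400 km on average.");

    // The next prompt now includes the previous turn as dialog history.
    console.log(chatEngine.buildPrompt("How long does light take to get there?"));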

Highlighted Details

  • Supports CodeEngine for NL-to-Code and ChatEngine for conversational prompts.
  • Manages prompt overflow by removing the oldest interactions when the token limit is reached.
  • Allows prompts to be represented and loaded via YAML for easier management and versioning (a sketch follows this list).
  • Functions include buildPrompt, addInteraction, resetContext, and more for flexible prompt manipulation.
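
For the YAML workflow, a prompt definition can live in a versioned file and be loaded at startup. In the sketch below, the loadYAML method name, the no-argument PromptEngine constructor, and the YAML field layout are assumptions drawn from this feature description rather than verified API; resetContext is among the functions named above:

    import { readFileSync } from "fs";
    import { PromptEngine } from "prompt-engine";

    // Hypothetical prompt.yaml kept under version control (field names assumed):
    //
    //   type: prompt-engine
    //   description: Natural language commands to JavaScript math code
    //   examples:
    //     - input: what's 10 plus 18
    //       response: console.log(10 + 18)

    // Constructing with no arguments is assumed to be valid here because the
    // definition is supplied by the YAML file.
    const engine = new PromptEngine();

    // loadYAML is assumed from the "loading via YAML" feature; consult the
    // examples folder for the exact method name and schema.
    engine.loadYAML(readFileSync("prompt.yaml", "utf8"));

    console.log(engine.buildPrompt("what's 10 minus 18"));

    // resetContext clears the accumulated dialog history while keeping the
    // description and examples intact.
    engine.resetContext();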

Maintenance & Community

  • Developed by Microsoft.
  • Contributions are welcome, subject to a Contributor License Agreement (CLA).
  • Follows the Microsoft Open Source Code of Conduct.

Licensing & Compatibility

  • License: MIT License.
  • Compatibility: Suitable for commercial use and integration with closed-source applications.

Limitations & Caveats

The library's prompt overflow strategy removes the oldest interaction first, which can discard early context in very long conversations. The effectiveness of generated prompts ultimately depends on the capabilities of the target LLM.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 1
  • Star History: 39 stars in the last 90 days
