Streaming code highlighting for LLM output
This library provides a streaming syntax highlighter for Shiki, designed to highlight text streams efficiently, which is particularly useful for real-time output from Large Language Models (LLMs). It lets developers add syntax highlighting to dynamic content without blocking the main thread.
How It Works
The core of the library is the CodeToTokenTransformStream, a TransformStream built on the Web Streams API. It takes raw text input and pipes it through Shiki's highlighter, converting code into an array of ThemedToken objects. This approach allows incremental processing of text data as it arrives, making it ideal for asynchronous operations. The allowRecalls option enables more granular token output by handling context-dependent highlighting changes via special "recall" tokens.
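A minimal sketch of that flow, assuming the constructor accepts a Shiki highlighter instance plus lang and theme options (the exact option names may differ; consult the package documentation):

```ts
import { createHighlighter } from 'shiki'
import { CodeToTokenTransformStream } from 'shiki-stream'

// A tiny ReadableStream standing in for incremental LLM output.
const textStream = new ReadableStream<string>({
  start(controller) {
    controller.enqueue('const answer = ')
    controller.enqueue('42\n')
    controller.close()
  },
})

const highlighter = await createHighlighter({
  themes: ['vitesse-dark'],
  langs: ['typescript'],
})

// Pipe raw text chunks through the transform to get a stream of themed tokens.
const tokenStream = textStream.pipeThrough(
  new CodeToTokenTransformStream({
    highlighter,
    lang: 'typescript',
    theme: 'vitesse-dark',
  }),
)
```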
Quick Start & Requirements
npm install shiki-stream
The main entry point is the CodeToTokenTransformStream class; pipe your text stream through it as shown below.
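For a typical LLM setup, a streaming HTTP response can be decoded and fed straight into the transform (the endpoint and option names below are illustrative):

```ts
import { createHighlighter } from 'shiki'
import { CodeToTokenTransformStream } from 'shiki-stream'

const highlighter = await createHighlighter({
  themes: ['vitesse-dark'],
  langs: ['typescript'],
})

// Hypothetical endpoint that streams plain-text code back from an LLM.
const response = await fetch('/api/generate-code')

const tokenStream = response.body!
  .pipeThrough(new TextDecoderStream()) // bytes -> strings
  .pipeThrough(
    new CodeToTokenTransformStream({ // strings -> themed tokens
      highlighter,
      lang: 'typescript',
      theme: 'vitesse-dark',
    }),
  )
```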
Highlighted Details
The allowRecalls option for fine-grained, context-aware highlighting.
Maintenance & Community
Last activity: 4 months ago; the project is currently marked inactive.
Licensing & Compatibility
Limitations & Caveats
The allowRecalls feature requires custom handling by the stream consumer to correctly manage context-dependent highlighting updates.
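A hypothetical consumer loop, under the assumption that a recall token is an object carrying the number of previously emitted tokens to retract (the actual token shape may differ; check the library's documentation):

```ts
import type { ThemedToken } from 'shiki'

// Assumed shape of a recall token: how many trailing tokens to retract.
interface RecallToken { recall: number }

const tokens: ThemedToken[] = []
// tokenStream is the stream produced in the quick-start sketch above.
const reader = tokenStream.getReader()

while (true) {
  const { done, value } = await reader.read()
  if (done) break
  const token = value as ThemedToken | RecallToken
  if ('recall' in token) {
    // Retract tokens whose highlighting has been invalidated by new context.
    tokens.length = Math.max(0, tokens.length - token.recall)
  } else {
    tokens.push(token)
  }
  // Re-render `tokens` here (e.g. update the DOM with the current token list).
}
```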