roadroller by lifthrasiir

JS packer for large demos, targeting js13kGames

Created 4 years ago
347 stars

Top 80.0% on SourcePulse

Project Summary

Roadroller is a heavyweight JavaScript packer designed to significantly reduce the size of large JavaScript demos, offering up to 15% better compression than standard ZIP/gzip recompressors. It suits projects from 4 KB intros up to larger demos, targeting developers who need maximum code size reduction.

How It Works

Roadroller employs advanced data compression techniques: a bytewise rANS coder combined with logistic context mixing, a form of neural-network prediction used for data compression. It uses sparse context models of up to 9th order, tuned per input with simulated annealing. This per-input specialization makes the compression particularly effective on minified JavaScript code.
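
The core idea of logistic context mixing can be sketched briefly. This is an illustrative simplification, not Roadroller's actual implementation (the helper names and learning rate are invented for the example): several context models each estimate the probability that the next bit is 1, the estimates are combined in the logit domain with learned weights, and the weights are nudged toward whichever models predicted the observed bit best.

```js
// Illustrative sketch of logistic context mixing (not Roadroller's code).
const stretch = p => Math.log(p / (1 - p)); // probability -> logit
const squash = x => 1 / (1 + Math.exp(-x)); // logit -> probability

// Combine per-model bit probabilities using the current mixing weights.
function mix(probs, weights) {
  let x = 0;
  for (let i = 0; i < probs.length; i++) x += weights[i] * stretch(probs[i]);
  return squash(x);
}

// After the actual bit is seen, take an online gradient step so that
// models which predicted it well gain influence.
function updateWeights(probs, weights, bit, rate = 0.01) {
  const error = bit - mix(probs, weights);
  for (let i = 0; i < probs.length; i++) {
    weights[i] += rate * error * stretch(probs[i]);
  }
}
```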

Quick Start & Requirements
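
A minimal sketch of the JavaScript API, following the upstream README; treat the identifiers (Packer, optimize, makeDecoder) and option names as illustrative and verify them against the installed version. The package also ships a CLI (npx roadroller) for the common case.

```js
// Sketch of packing one script with the Roadroller JS API
// (per the upstream README; verify against your installed version).
import { Packer } from 'roadroller';

const inputs = [
  {
    data: `console.log('Hello, world!');`,
    type: 'js',     // input type: JavaScript or plain text
    action: 'eval', // what the decoder does with the decompressed data
  },
];

const packer = new Packer(inputs, { /* compression options */ });
await packer.optimize(); // tunes context models for this input; can take a while

const { firstLine, secondLine } = packer.makeDecoder();
console.log(firstLine + secondLine); // self-extracting output
```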

Highlighted Details

  • Achieves up to 15% better compression than ZIP/gzip.
  • Supports custom input types (JS, Text) and actions (eval, write to document).
  • Configurable compression parameters for time/memory trade-offs.
  • Output is ES6 compatible, running in modern browsers and JS environments.

Maintenance & Community

The project is maintained by lifthrasiir. Further community or roadmap details are not explicitly provided in the README.

Licensing & Compatibility

The Roadroller compressor is licensed under the MIT license. The generated decoder code is placed in the public domain. This allows for commercial use and integration into closed-source projects.

Limitations & Caveats

Roadroller is resource-intensive, requiring significant memory and compression time. It is weaker than DEFLATE at exploiting long-distance duplication. MSIE compatibility requires transpilation, and minor behavioral differences can arise from Math.exp/Math.log approximations across JS engines, though this is generally not an issue for common engines.

Health Check

  • Last Commit: 3 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 3 stars in the last 30 days
