huggingface/transformers.js: Run Transformers models directly in your browser
Top 3.3% on SourcePulse
This library enables state-of-the-art machine learning models to run directly in the browser using JavaScript, eliminating the need for server-side infrastructure. It targets web developers and researchers looking to integrate advanced NLP, computer vision, audio, and multimodal capabilities into client-side applications, offering an API that mirrors the Hugging Face Python library.
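For example, the familiar pipeline API can run a task end to end in a few lines (a minimal sketch; the default model for the task is fetched from the Hugging Face Hub and cached by the browser):

import { pipeline } from '@huggingface/transformers';

// Create a sentiment-analysis pipeline; the default model for the task is downloaded on first use.
const classifier = await pipeline('sentiment-analysis');

// Run inference entirely client-side.
const result = await classifier('I love running models in the browser!');
console.log(result); // e.g. [{ label: 'POSITIVE', score: 0.99 }]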
How It Works
Transformers.js leverages ONNX Runtime for efficient execution of models within the browser. Models are converted to the ONNX format, a widely supported intermediate representation, using 🤗 Optimum. This approach allows for broad model compatibility and optimized performance, with support for both CPU (via WebAssembly) and GPU (via WebGPU) execution.
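A sketch of how backend selection surfaces in the API, assuming the v3 device and dtype pipeline options (the model ID shown is just an illustrative example from the Hub):

import { pipeline } from '@huggingface/transformers';

// Request the WebGPU backend and a q8-quantized model variant;
// setting device to 'wasm' instead would select the WebAssembly (CPU) backend.
const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2', {
  device: 'webgpu',
  dtype: 'q8',
});

// Compute a mean-pooled, normalized sentence embedding in the browser.
const embedding = await extractor('Models run fully client-side.', { pooling: 'mean', normalize: true });
console.log(embedding.dims);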
Quick Start & Requirements
npm i @huggingface/transformers
Highlighted Details
- Quantized model variants (e.g., q4, q8) for reduced bandwidth and faster inference in resource-constrained environments.
- pipeline API mirroring the Python library for ease of use.
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
WebGPU is an experimental API and may have inconsistent support across browsers. Some advanced or less common model architectures might not yet be supported or require specific conversion steps.
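One defensive pattern (a sketch, not an official recommendation) is to feature-detect WebGPU and fall back to the WebAssembly backend:

import { pipeline } from '@huggingface/transformers';

// navigator.gpu is the standard WebGPU entry point; browsers without WebGPU omit it.
const device = 'gpu' in navigator ? 'webgpu' : 'wasm';

// Passing null keeps the default model for the task; only the execution backend changes.
const classifier = await pipeline('sentiment-analysis', null, { device });
console.log(await classifier('Falls back gracefully on older browsers.'));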