Go library for ONNX transformer pipelines
This library provides ONNX transformer pipelines for Go, enabling AI use cases like text generation and image classification directly within Go applications. It targets Go developers and ML engineers seeking to deploy and scale Hugging Face transformer models efficiently on their own hardware, bypassing the need for Python RPC services.
How It Works
Hugot leverages ONNX models for compatibility with Hugging Face's ecosystem, allowing models trained in Python to be exported and run with identical predictions in Go. It supports pluggable backends, including a native Go implementation (GoMLX), ONNX Runtime (ORT), and OpenXLA, with ORT and OpenXLA supporting GPU acceleration via CUDA. The library prioritizes ease of use and performance for production environments.
Quick Start & Requirements
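As a hedged sketch of the setup: the `-tags` values are documented by the project, while the exact commands and paths will depend on your module layout.

```shell
# Fetch the library into your Go module.
go get github.com/knights-analytics/hugot

# Select a backend at build time via build tags:
go build -tags ORT ./...   # ONNX Runtime backend (requires the onnxruntime shared library)
go build -tags XLA ./...   # OpenXLA backend
go build -tags ALL ./...   # build with all backends available
```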
Hugot can be used as a Go library (github.com/knights-analytics/hugot) or a CLI. Setup involves selecting a backend via build tags (-tags ORT, -tags XLA, -tags ALL) and potentially downloading .so or .a files for ONNX Runtime and tokenizers, or using the provided Docker image. Examples can be found in hugot_test.go and within the README.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The library and CLI are currently only built and tested on amd64-linux. Untested accelerator backends include TensorRT, DirectML, CoreML, and OpenVINO. Training is limited to the FeatureExtractionPipeline and requires the XLA backend.