Tencent-Hunyuan: Multilingual translation models with advanced features
Top 82.5% on SourcePulse
Tencent-Hunyuan's HY-MT1.5 project delivers advanced neural machine translation models, HY-MT1.5-1.8B and HY-MT1.5-7B, supporting 33 languages plus ethnic/dialect variants. It targets developers who need efficient, high-quality translation, offering competitive performance, edge-deployment capability, and features for complex scenarios.
How It Works
The project features HY-MT1.5-1.8B and HY-MT1.5-7B models, covering 33 languages and 5 ethnic/dialect variations. HY-MT1.5-7B is an enhanced WMT25 championship model, optimized for explanatory and mixed-language translation. The smaller HY-MT1.5-1.8B model matches larger counterparts' performance with fewer parameters, enabling real-time and edge deployment. Both models support terminology intervention, contextual translation, and formatted translation for nuanced requirements.
Quick Start & Requirements
Install with pip install transformers==4.56.0; a companion transformers branch may be needed for full support. GPU acceleration is advised. Quantized models require tools such as AngelSlim or specific deployment frameworks.
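As a rough sketch of how the models might be called through transformers: the model id and the prompt wording below are assumptions (not taken from the README), so check the official Hugging Face model card for the exact repository name and instruction format.

```python
# Minimal inference sketch for HY-MT1.5. The model id and prompt
# template are assumptions; consult the official model card for the
# supported instruction format.

def build_prompt(text: str, target_lang: str) -> str:
    # Hypothetical instruction format for a translation request.
    return f"Translate the following text into {target_lang}:\n\n{text}"

def translate(text: str, target_lang: str,
              model_id: str = "tencent/HY-MT1.5-1.8B") -> str:
    # Deferred import so the prompt helper stays usable without the
    # GPU stack installed (pip install transformers==4.56.0).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    messages = [{"role": "user", "content": build_prompt(text, target_lang)}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output_ids = model.generate(input_ids, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:],
                            skip_special_tokens=True)
```

Calling translate("Hello, world.", "French") would download the weights on first use; production or quantized serving would instead go through frameworks such as vLLM or SGLang.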
Maintenance & Community
Tencent-Hunyuan actively maintains the project, releasing models on Hugging Face and ModelScope. Contact for feedback is via hunyuan_opensource@tencent.com.
Licensing & Compatibility
The README does not specify a software license, making terms for use, modification, and distribution undefined. Commercial use compatibility is uncertain.
Limitations & Caveats
Requires specific transformers versions, with a note about a "companion branch" suggesting potential version sensitivity. Deployment and quantization may need specialized tools (e.g., AngelSlim) or complex framework setup (TensorRT-LLM, vLLM, SGLang). The absence of a clear license is a significant adoption blocker.