Bilingual model for research and evaluation
GLM-130B is an open-source, 130-billion-parameter, bilingual (English/Chinese) language model aimed at researchers and developers working with large-scale NLP models. It delivers strong performance on a range of English and Chinese benchmarks and supports efficient inference through INT4 quantization and optimized inference libraries such as FasterTransformer.
How It Works
GLM-130B uses the General Language Model (GLM) pre-training objective: autoregressive blank infilling, in which masked spans of text are regenerated left to right while the remaining, uncorrupted context is attended bidirectionally. This lets a single model handle both left-to-right generation and fill-in-the-blank tasks. The model is also designed for affordable deployment, supporting INT4 weight quantization so that inference can run on consumer-grade GPUs.
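To make the objective concrete, here is a minimal, self-contained sketch of how a blank-infilling training example is constructed. This is a toy illustration of the scheme described in the GLM paper, not GLM-130B's actual preprocessing code; the special-token names ([MASK], [sop], [eop]) loosely follow the paper's notation and are assumptions here.

```python
# Toy sketch of GLM-style autoregressive blank infilling (illustrative only).
import random

def make_glm_example(tokens, span_ranges):
    """Corrupt `tokens` by replacing each (start, end) span with [MASK] (Part A),
    then collect the removed spans as autoregressive targets (Part B)."""
    # Part A: the corrupted text, which the model attends to bidirectionally.
    part_a, spans = [], []
    cursor = 0
    for start, end in sorted(span_ranges):
        part_a.extend(tokens[cursor:start])
        part_a.append("[MASK]")
        spans.append(tokens[start:end])
        cursor = end
    part_a.extend(tokens[cursor:])

    # Part B: the masked spans in random order; each is generated left to
    # right, starting from a start-of-piece token and ending at [eop].
    random.shuffle(spans)
    part_b = []
    for span in spans:
        part_b.append("[sop]")
        part_b.extend(span)
        part_b.append("[eop]")
    return part_a, part_b

tokens = "GLM-130B is an open bilingual pre-trained model".split()
part_a, part_b = make_glm_example(tokens, [(3, 5)])
print(part_a)  # ['GLM-130B', 'is', 'an', '[MASK]', 'pre-trained', 'model']
print(part_b)  # ['[sop]', 'open', 'bilingual', '[eop]']
```

During training, the model sees Part A with bidirectional attention and predicts Part B token by token, which is how one objective covers both understanding-style and generation-style tasks.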
Quick Start & Requirements
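The official GLM-130B repository ships its own checkpoints and inference scripts; as a rough orientation, the sketch below shows a generic 4-bit loading path using the Hugging Face transformers + bitsandbytes stack. The model ID and the availability of a transformers-loadable checkpoint are assumptions for illustration, not the project's documented quick start.

```python
# Hypothetical sketch: generic 4-bit inference with transformers + bitsandbytes.
# The Hub ID below is an assumption; consult the official GLM-130B repo for
# its actual checkpoints and inference scripts.
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig

model_id = "THUDM/glm-130b"  # assumed model ID

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # weight-only 4-bit quantization
    bnb_4bit_compute_dtype=torch.float16,  # run compute in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # spread layers across available GPUs
    trust_remote_code=True,
)

inputs = tokenizer("GLM-130B is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```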