Resource list: Chinese NLP pretrained models, LLMs, multimodal models
This repository serves as a curated collection of high-quality Chinese pre-trained NLP models, including large language models (LLMs), multimodal models, and their associated resources. It aims to provide researchers and developers with a centralized hub for discovering and accessing state-of-the-art models for Chinese natural language processing tasks.
How It Works
The project meticulously gathers and organizes information on a vast array of Chinese NLP models, categorizing them by architecture (e.g., BERT, GPT, T5, RoFormer), domain (e.g., general, finance, medical, code), and modality (text-only, multimodal). It provides links to Hugging Face, model repositories, papers, and project pages, facilitating easy access and evaluation.
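To illustrate the organization described above, here is a hypothetical sketch of how the list's categorization could be modeled in code: each entry records an architecture, domain, and modality and can be filtered along any axis. The `ModelEntry` fields, the `filter_models` helper, and the sample entries are illustrative assumptions, not part of the repository itself.

```python
# Hypothetical sketch: modeling the list's categories (architecture,
# domain, modality) as a filterable catalog. Entries are examples only.
from dataclasses import dataclass


@dataclass
class ModelEntry:
    name: str
    architecture: str  # e.g. "BERT", "GPT", "T5", "RoFormer"
    domain: str        # e.g. "general", "finance", "medical", "code"
    modality: str      # "text" or "multimodal"
    url: str


CATALOG = [
    ModelEntry("bert-base-chinese", "BERT", "general", "text",
               "https://huggingface.co/bert-base-chinese"),
    ModelEntry("Qwen-VL", "GPT", "general", "multimodal",
               "https://huggingface.co/Qwen/Qwen-VL"),
]


def filter_models(catalog, **criteria):
    """Return entries whose fields match every given key=value pair."""
    return [m for m in catalog
            if all(getattr(m, k) == v for k, v in criteria.items())]


print([m.name for m in filter_models(CATALOG, modality="text")])
# → ['bert-base-chinese']
```

Filtering on several axes at once (e.g. `filter_models(CATALOG, architecture="BERT", domain="general")`) mirrors how a reader might narrow the list to models fitting a specific task.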
Quick Start & Requirements
Most models in the list are hosted on Hugging Face (🤗HF) or ModelScope, and can be loaded with the `transformers` or `modelscope` Python libraries respectively.
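As a minimal sketch of the Hugging Face route, the snippet below loads a listed model with the `transformers` library. `bert-base-chinese` is used purely as an illustrative model ID; substitute any model ID linked from the collection. The first run downloads weights from the Hugging Face Hub, so network access is assumed.

```python
# Minimal sketch: loading a Chinese pretrained model via `transformers`.
# "bert-base-chinese" is an example ID; any listed model ID works the same way.
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "bert-base-chinese"  # illustrative choice

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode a short Chinese sentence and run it through the encoder.
inputs = tokenizer("你好,世界", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

Models hosted on ModelScope follow an analogous pattern with the `modelscope` library's snapshot-download and pipeline APIs.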
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats