BERT_Paper_Chinese_Translation by yuanxiaosc

Chinese translation of the BERT research paper

created 6 years ago
688 stars

Top 50.4% on sourcepulse

Project Summary

This repository provides a complete Chinese translation of the seminal BERT paper, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." It aims to assist Chinese-speaking researchers and practitioners in understanding BERT's groundbreaking approach to language representation.

How It Works

The project translates the original English paper, including all figures and references, into Chinese. It highlights BERT's core innovation: pre-training deep bidirectional representations with a "Masked Language Model" (MLM) objective inspired by the cloze task. Unlike unidirectional models such as OpenAI GPT, or shallowly bidirectional approaches such as ELMo (which concatenates independently trained left-to-right and right-to-left representations), BERT conditions on both left and right context simultaneously in every layer. This design yields strong performance across a wide range of NLP tasks with minimal task-specific architectural changes.
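To make the MLM objective concrete, here is a minimal sketch of the corruption step described in the paper: 15% of input tokens are selected for prediction, and of those, 80% are replaced with [MASK], 10% with a random token, and 10% are left unchanged. The toy vocabulary and helper names below are illustrative, not part of BERT's actual implementation.

```python
import random

# Illustrative vocabulary; BERT's real WordPiece vocab has ~30k entries.
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def mask_tokens(tokens, mask_rate=0.15, rng=None):
    """Corrupt a token sequence per the BERT paper's MLM recipe.

    Returns (corrupted_tokens, targets), where targets[i] is the original
    token at each selected position and None elsewhere.
    """
    rng = rng or random.Random(0)
    corrupted = list(tokens)
    targets = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:      # select ~15% of positions
            targets[i] = tok              # model must predict the original
            roll = rng.random()
            if roll < 0.8:
                corrupted[i] = "[MASK]"   # 80%: replace with [MASK]
            elif roll < 0.9:
                corrupted[i] = rng.choice(VOCAB)  # 10%: random token
            # else 10%: keep the original token (still predicted)
    return corrupted, targets
```

Keeping 10% of selected tokens unchanged is what lets the model's representations stay useful at fine-tuning time, when no [MASK] token ever appears in the input.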

Quick Start & Requirements

  • The primary content is available as a .md file within the repository.
  • No specific software or hardware requirements are listed for accessing the translation itself.
  • Links to the original paper and pre-trained models are provided.

Highlighted Details

  • Offers a full Chinese translation of the BERT paper, including figures and references.
  • Provides links to all cited resources for further research.
  • Notes that this is a translation of the November 2018 version of the BERT paper, with minor differences from the May 2019 v2.
  • Mentions that the author plans to translate and analyze more deep learning and NLP papers.

Maintenance & Community

  • The repository is maintained by yuanxiaosc.
  • The author encourages users to star the repository if they find the work helpful.
  • Contact information for commercial use inquiries is provided.

Licensing & Compatibility

  • The README states "转载请注明出处,商用请联系译者" (please cite the source when reproducing; contact the translator for commercial use). This is an informal notice rather than a standard open-source license, and it effectively restricts the translation to non-commercial use; the original paper and Google's released BERT models carry their own licensing terms.

Limitations & Caveats

  • This repository contains only the translation of the paper, not the BERT model implementation or code.
  • The translation is based on an earlier version of the BERT paper.

Health Check

  • Last commit: 5 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 4 stars in the last 90 days
