PLMpapers by thunlp

PLM papers list

created 5 years ago
3,357 stars

Top 14.8% on sourcepulse

Project Summary

This repository serves as a curated list of essential academic papers on Pre-trained Language Models (PLMs), targeting NLP researchers and practitioners. It aims to provide a structured overview of the field's foundational and recent advancements, including a visual representation of paper relationships and links to code and models.

How It Works

The repository organizes papers by topic, such as core PLM research, model compression, analysis, and fine-tuning techniques. It also includes a survey paper and a diagram illustrating the connections among the listed PLM works, offering a high-level view of the field's evolution.

Quick Start & Requirements

No installation or execution is required. The repository is a static collection of links to papers, code, and models.

Highlighted Details

  • Comprehensive coverage of PLMs from 2015 to 2020, including seminal works like BERT, GPT, XLM, and T5.
  • Dedicated sections for model compression, analysis, and fine-tuning strategies.
  • Links to official code repositories and pre-trained models for many listed papers.
  • Includes a survey paper "Pre-trained Models: Past, Present and Future."

Maintenance & Community

Maintained by Xiaozhi Wang and Zhengyan Zhang. Suggestions and corrections are welcome.

Licensing & Compatibility

The list itself is freely distributable and usable. Licenses for the linked code and models depend on their respective repositories and should be checked individually.

Limitations & Caveats

The repository is a curated list and does not host the papers themselves, only links to them. Coverage largely ends in late 2020, so newer advancements may not be included.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 10 stars in the last 90 days
