Deprecated starter for Transformer-based text classification tasks
This repository provides a starting point for text classification with HuggingFace's PyTorch-Transformers library, aimed at researchers and developers who need to quickly fine-tune BERT, XLNet, RoBERTa, and XLM models. It offers code for training and evaluation, simplifying the process of applying these advanced NLP models to classification tasks.
How It Works
The project leverages the PyTorch-Transformers library to fine-tune various transformer architectures for text classification. It provides pre-written notebooks and scripts that handle data preparation, model training, and evaluation, abstracting away much of the low-level detail of the HuggingFace library.
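To give a rough sense of what the notebooks do under the hood, here is a minimal fine-tuning sketch against the pytorch-transformers API; the toy data, padding scheme, and hyperparameters are illustrative assumptions, not the repository's exact code.

# Minimal sketch of a pytorch-transformers fine-tuning loop (illustrative, not the repo's code)
import torch
from torch.utils.data import DataLoader, TensorDataset
from pytorch_transformers import BertTokenizer, BertForSequenceClassification, AdamW

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)

texts = ["a gripping, well-acted film", "dull and far too long"]   # toy examples
labels = torch.tensor([1, 0])

# Tokenize and pad to a common length (0 is BERT's [PAD] id)
encoded = [tokenizer.encode(t, add_special_tokens=True) for t in texts]
max_len = max(len(ids) for ids in encoded)
input_ids = torch.tensor([ids + [0] * (max_len - len(ids)) for ids in encoded])

loader = DataLoader(TensorDataset(input_ids, labels), batch_size=2)
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
for batch_ids, batch_labels in loader:
    optimizer.zero_grad()
    loss, logits = model(batch_ids, labels=batch_labels)  # returns (loss, logits) when labels are given
    loss.backward()
    optimizer.step()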
Quick Start & Requirements
conda create -n transformers python pandas tqdm jupyter
conda activate transformers
conda install pytorch cudatoolkit=10.0 -c pytorch   (or the CPU-only build)
conda install scipy scikit-learn
pip install pytorch-transformers tensorboardX
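After installation, a quick sanity check of the environment (not part of the original repo, just a suggested import test) might look like:

# Check that PyTorch sees the GPU and that pytorch-transformers can load a tokenizer
import torch
from pytorch_transformers import BertTokenizer

print(torch.cuda.is_available())   # False is expected with the CPU-only install
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
print(tokenizer.tokenize("Hello world"))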
Maintenance & Community
This repository is deprecated and will not be updated. The author recommends using simpletransformers
, a successor library that is actively maintained and easier to use.
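For reference, the recommended migration boils down to a few lines. The toy DataFrame and the "bert" / "bert-base-uncased" choice below are illustrative assumptions, but the ClassificationModel interface shown is simpletransformers' standard classification API.

# Sketch of text classification with simpletransformers (illustrative data)
import pandas as pd
from simpletransformers.classification import ClassificationModel

# simpletransformers expects columns named 'text' and 'labels'
train_df = pd.DataFrame(
    [["a gripping, well-acted film", 1], ["dull and far too long", 0]],
    columns=["text", "labels"],
)

model = ClassificationModel("bert", "bert-base-uncased", num_labels=2, use_cuda=False)
model.train_model(train_df)                                      # fine-tune
result, model_outputs, wrong_preds = model.eval_model(train_df)  # evaluate
predictions, raw_outputs = model.predict(["a surprisingly good film"])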
Licensing & Compatibility
The repository does not explicitly state a license. However, it is built upon the HuggingFace pytorch-transformers
library, which is released under the Apache 2.0 license. Compatibility with commercial or closed-source projects would depend on the underlying library's license.
Limitations & Caveats
The project is deprecated and may not be compatible with current versions of the HuggingFace Transformers library. Users are strongly advised to migrate to the simpletransformers
library for ongoing support and features.