ABSA-PyTorch by songyouwei

PyTorch implementations for aspect-based sentiment analysis

created 7 years ago
2,080 stars

Top 22.0% on sourcepulse

Project Summary

This repository provides PyTorch implementations for Aspect-Based Sentiment Analysis (ABSA), a task focused on identifying sentiment towards specific aspects within text. It targets researchers and practitioners in Natural Language Processing (NLP) and sentiment analysis, offering a collection of state-of-the-art models for experimentation and development.

How It Works

The project implements both BERT-based and non-BERT-based neural network architectures for ABSA. BERT-based models leverage pre-trained transformer architectures for enhanced contextual understanding, while non-BERT models utilize architectures like Graph Convolutional Networks (GCNs), Attention mechanisms, and LSTMs. This dual approach allows users to compare performance and choose models suited to their specific data and computational resources.
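As a concrete illustration of the sentence-pair formulation (used by models like BERT-SPC), the context and the aspect can be encoded as two segments so the classifier scores sentiment toward that aspect. This is a minimal sketch using the Hugging Face transformers library, not the repository's own code; the model name, label count, and example text are assumptions.

    # Minimal sketch of a BERT-SPC-style sentence-pair input
    # (illustrative; not the repository's code).
    import torch
    from transformers import BertTokenizer, BertForSequenceClassification

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    # num_labels=3 assumes the usual negative/neutral/positive polarity set.
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=3
    )

    context = "The food was great but the service was slow."
    aspect = "service"

    # Encoded as: [CLS] context [SEP] aspect [SEP]
    inputs = tokenizer(context, aspect, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    print(logits.softmax(dim=-1))  # distribution over the three polarities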

Quick Start & Requirements

  • Install dependencies: pip install -r requirements.txt
  • Non-BERT models require GloVe pre-trained word vectors (details in data_utils.py).
  • For RTX 30 series GPUs, use requirements_rtx30.txt due to potential CUDA/PyTorch version conflicts.
  • Training: python train.py --model_name bert_spc --dataset restaurant
  • Inference examples: infer_example.py
  • K-fold cross-validation: train_k_fold_cross_val.py (see the sketch after this list)
  • Official framework for flexible training/inference: PyABSA
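The k-fold script's core idea can be sketched generically: partition the training data into folds, train a fresh model on each split, and average the per-fold metrics. The sketch below uses scikit-learn's KFold purely for illustration and is not the script's actual code.

    # Generic sketch of the idea behind train_k_fold_cross_val.py
    # (illustrative; the script's real interface may differ).
    from sklearn.model_selection import KFold

    samples = list(range(100))  # stand-in for the ABSA training examples
    kf = KFold(n_splits=5, shuffle=True, random_state=42)

    for fold, (train_idx, val_idx) in enumerate(kf.split(samples)):
        # Train a fresh model on train_idx, evaluate on val_idx, and
        # average the per-fold scores for a more stable estimate.
        print(f"fold {fold}: {len(train_idx)} train / {len(val_idx)} val")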

Highlighted Details

  • Implements multiple BERT-based models (BERT-ADA, BERT-PT, ABSA-BERT-pair, LCF-BERT, AEN-BERT, and BERT-SPC for sentence-pair classification).
  • Includes various non-BERT models (ASGCN, MGAN, AOA, TNet, Cabasc, RAM, MemNet, IAN, ATAE-LSTM, TD-LSTM, LSTM).
  • Offers guidance on hyperparameter sensitivity for BERT models on small datasets.
  • Links to PyABSA, a separate framework, for more streamlined ABSA workflows.

Maintenance & Community

The project acknowledges several contributors and follows the all-contributors specification. Contributions are welcomed.

Licensing & Compatibility

  • License: MIT
  • Compatible with commercial use and closed-source linking.

Limitations & Caveats

Training of the non-BERT models can be unstable. BERT-based models need careful hyperparameter tuning on smaller datasets, particularly of the learning rate, and task-specific fine-tuning is recommended to get the most out of BERT.
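As a hedged illustration of the kind of conservative setup that helps on small datasets, the snippet below pairs a small learning rate with linear warmup. The specific values (2e-5, 10% warmup) are common fine-tuning defaults, not the repository's settings.

    # Illustrative conservative fine-tuning setup (assumed values,
    # not the repository's defaults): small learning rate + linear warmup.
    import torch
    from transformers import get_linear_schedule_with_warmup

    model = torch.nn.Linear(768, 3)  # stand-in for a BERT-based ABSA head
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5, weight_decay=0.01)

    num_training_steps = 1000  # depends on dataset size, batch size, epochs
    scheduler = get_linear_schedule_with_warmup(
        optimizer,
        num_warmup_steps=int(0.1 * num_training_steps),
        num_training_steps=num_training_steps,
    )
    # Call scheduler.step() after each optimizer.step() in the training loop.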

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull requests (30d): 0
  • Issues (30d): 0
  • Star history: 21 stars in the last 90 days
