Question generation study using transformers
This repository provides a straightforward, end-to-end approach to neural question generation (QG) using pre-trained transformer models. It targets researchers and developers who want to implement or experiment with QG, offering simplified data processing, training scripts, and inference pipelines that make QG more accessible than existing, more complex methods.
How It Works
The project explores three main QG strategies: answer-aware QG (where the answer is provided), answer extraction models, and end-to-end (answer-agnostic) QG. It leverages the T5 model, adapting it for these tasks through various input formatting techniques like "prepend" and "highlight" to guide the model. A key innovation is the "multitask QA-QG" approach, which fine-tunes a single T5 model to perform answer extraction, question generation, and question answering simultaneously, reducing pipeline complexity.
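To make the input formats concrete, here is a minimal sketch of how a context/answer pair might be rendered in the "prepend" and "highlight" styles; the task prefix and the `<hl>` sentinel token follow the repo's T5 conventions, but treat the exact strings as assumptions rather than the project's verbatim format:

```python
# Minimal sketch of the two answer-aware input formats (the exact prefixes
# and the <hl> sentinel token are assumptions based on the repo's conventions).
context = "42 is the answer to life, the universe and everything."
answer = "42"

# "prepend" format: the answer is prepended to the context as plain text.
prepend_input = f"answer: {answer}  context: {context}"

# "highlight" format: the answer span is wrapped in <hl> tokens so the
# model can locate it inside the context.
highlight_input = "generate question: " + context.replace(
    answer, f"<hl> {answer} <hl>", 1
)

print(prepend_input)
print(highlight_input)
```

The highlight style keeps the answer in its original position, which gives the model the surrounding sentence as context; the prepend style is simpler but loses that positional signal.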
Quick Start & Requirements
Install the dependencies (the nlp package is optional and needed only for fine-tuning), then download the NLTK punkt tokenizer:

```bash
pip install transformers==3.0.0 nltk nlp==0.2.0
python -m nltk.downloader punkt
```

Three inference pipelines are provided: pipeline("question-generation"), pipeline("multitask-qa-qg"), and pipeline("e2e-qg"), demonstrated below.
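Assuming the repository's pipelines.py helper is importable, usage mirrors the 🤗 Transformers pipeline API; the names below follow the repo's conventions but should be read as a sketch, not a verbatim transcript of its README:

```python
# Sketch of the three pipelines, assuming the repo's pipelines.py module
# is on the path (input/output shapes are assumptions).
from pipelines import pipeline

# Answer-aware QG: extracts candidate answers, then generates a question
# for each one.
nlp = pipeline("question-generation")
print(nlp("42 is the answer to life, the universe and everything."))

# Multitask model: a single T5 checkpoint handles answer extraction,
# question generation, and question answering.
nlp_qa_qg = pipeline("multitask-qa-qg")
print(nlp_qa_qg({
    "question": "What is 42?",
    "context": "42 is the answer to life, the universe and everything.",
}))

# End-to-end (answer-agnostic) QG: questions straight from raw text,
# no answer required.
nlp_e2e = pipeline("e2e-qg")
print(nlp_e2e("Python is a programming language created by Guido van Rossum."))
```

The multitask pipeline is what lets the project avoid a separate answer-extraction model: the same checkpoint both proposes answer spans and generates questions for them.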
Highlighted Details
Maintenance & Community
The repository was last updated about 1 year ago and is marked inactive.

Licensing & Compatibility

Limitations & Caveats
The project pins transformers==3.0.0, an older release that may require careful dependency management or code updates for compatibility with newer 🤗 Transformers releases.