PyTorch code for the TabFormer tabular transformers research paper
This repository provides PyTorch code and data for TabFormer, a hierarchical-transformer approach to modeling multivariate time series from tabular data. It addresses the challenge of representing tabular data for time series analysis and targets researchers and practitioners in time series forecasting and sequence modeling, offering a more effective way to capture complex temporal dependencies within tabular datasets.
How It Works
TabFormer adapts the transformer architecture for tabular data by introducing specialized modules for hierarchical representation. It utilizes modified components from HuggingFace's Transformers library, including a Modified Adaptive Softmax for handling masking and a Modified DataCollatorForLanguageModeling tailored for tabular structures. This approach allows transformers, typically used for sequential text data, to effectively process and learn from structured, multi-field tabular time series.
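The hierarchical idea described above can be sketched in a few lines of PyTorch: embed each field of a row, run a field-level transformer over the fields, pool into a single row embedding, then run a sequence-level transformer over the rows. This is a minimal illustrative sketch, not TabFormer's actual implementation; all class names, shapes, and hyperparameters here are assumptions.

```python
import torch
import torch.nn as nn

class HierarchicalTabularEncoder(nn.Module):
    """Illustrative sketch of a hierarchical tabular encoder (names and
    shapes are assumptions, not TabFormer's actual API): a field-level
    transformer builds a row embedding, and a sequence-level transformer
    models the sequence of rows."""

    def __init__(self, vocab_sizes, d_model=64, nhead=4):
        super().__init__()
        # One embedding table per categorical/quantized field.
        self.field_embeds = nn.ModuleList(
            nn.Embedding(v, d_model) for v in vocab_sizes
        )
        field_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.field_encoder = nn.TransformerEncoder(field_layer, num_layers=1)
        seq_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.seq_encoder = nn.TransformerEncoder(seq_layer, num_layers=1)

    def forward(self, x):
        # x: (batch, seq_len, num_fields) integer field ids
        b, t, f = x.shape
        fields = torch.stack(
            [emb(x[..., i]) for i, emb in enumerate(self.field_embeds)], dim=2
        )                                   # (b, t, f, d_model)
        fields = fields.view(b * t, f, -1)
        row = self.field_encoder(fields).mean(dim=1)  # pool fields -> row embedding
        rows = row.view(b, t, -1)
        return self.seq_encoder(rows)       # (b, t, d_model)

enc = HierarchicalTabularEncoder(vocab_sizes=[10, 20, 30])
out = enc(torch.randint(0, 10, (2, 5, 3)))
print(out.shape)  # torch.Size([2, 5, 64])
```

Mean-pooling over fields is one simple choice for collapsing a row into a single vector; a learned [CLS]-style field or concatenation would also work under the same hierarchical structure.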
Quick Start & Requirements
Create the conda environment from the provided spec: conda env create -f setup.yml
Maintenance & Community
No specific information on maintainers, community channels, or roadmap is provided in the README.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The code is tested on specific older versions of dependencies (Python 3.7, PyTorch 1.6.0, Transformers 3.2.0), which may require careful environment management or updates for compatibility with current systems. Accessing the provided dataset requires git-lfs, which can have bandwidth limitations.
The repository was last updated roughly two years ago and appears inactive.