PyTorch code for graph attention pooling (NeurIPS 2019 paper)
This repository provides code and data for reproducing experiments from a NeurIPS 2019 paper on Graph Neural Networks (GNNs) with attention mechanisms. It targets researchers and practitioners interested in understanding how attention impacts generalization in GNNs, offering implementations for tasks like image classification (MNIST) and synthetic graph problems (COLORS, TRIANGLES).
How It Works
The project implements a ChebyGIN model, a GNN architecture that incorporates attention for node pooling. This approach allows the model to learn which nodes are most relevant for a given task, improving generalization. The attention mechanism is explored in unsupervised, supervised, and weakly-supervised settings, with the weakly-supervised method showing competitive performance on benchmark datasets.
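For intuition, below is a minimal, hypothetical sketch of attention-based top-k node pooling in PyTorch. It is not the repository's ChebyGIN layer; the `AttentionPool` class, the `ratio` parameter, and the single-graph input shape are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """Minimal attention pooling over graph nodes: scores each node,
    then keeps the top-k nodes weighted by their attention coefficients."""
    def __init__(self, in_dim, ratio=0.5):
        super().__init__()
        self.score = nn.Linear(in_dim, 1)  # learned per-node attention score
        self.ratio = ratio                 # fraction of nodes to keep

    def forward(self, x):
        # x: (num_nodes, in_dim) node features for a single graph
        alpha = torch.softmax(self.score(x).squeeze(-1), dim=0)  # attention over nodes
        k = max(1, int(self.ratio * x.size(0)))
        top_alpha, idx = alpha.topk(k)                  # keep the most relevant nodes
        return x[idx] * top_alpha.unsqueeze(-1), idx    # scale kept features by attention
```

Scaling the kept features by their attention weights keeps the pooling step differentiable, which is what allows node relevance to be learned end-to-end from the task loss.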
Quick Start & Requirements
- Install dependencies: `pip install torch torchvision numpy scipy scikit-image scikit-learn networkx` (requires PyTorch >= 0.4.1 and Python >= 3.6.1).
- `Ax` is required for hyperparameter tuning on specific datasets.
- Prepare the datasets with `./scripts/prepare_data.sh` (approx. 1 hour, 2 GB).
- Pretrained evaluation checkpoints are provided (`MNIST_eval_models`, `TRIANGLES_eval_models`).
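As a quick sanity check after setup, the snippet below verifies the stated version requirements and inspects a downloaded checkpoint. The file path is hypothetical, and the checkpoints' exact format (full pickled model vs. `state_dict`) should be confirmed against the repo.

```python
import sys
import torch

# Version sanity checks against the stated requirements.
assert sys.version_info >= (3, 6, 1)   # Python >= 3.6.1
print(torch.__version__)               # should be >= 0.4.1

# Hypothetical path: inspect one of the provided evaluation checkpoints.
state = torch.load('MNIST_eval_models/checkpoint.pth', map_location='cpu')
print(type(state))  # dict-like state_dict or a full model, depending on how it was saved
```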
Maintenance & Community
The repository is associated with Boris Knyazev, Graham W. Taylor, and Mohamed Amer. No specific community channels or active maintenance indicators are present in the README.
Licensing & Compatibility
The repository does not explicitly state a license. The code is presented for research purposes, and users should verify licensing for commercial or closed-source applications.
Limitations & Caveats
The README notes that the MNIST experiments were generated without squaring the `dist` variable in the code; squaring it may change results. The provided models were also trained on smaller graphs (N <= 25 for TRIANGLES) than some of the test graphs.
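To make that caveat concrete, here is a small illustrative snippet; the actual role of `dist` in the repo's preprocessing is not shown in this summary, so the `coord` tensor and the use of `torch.cdist` are assumptions.

```python
import torch

# Hypothetical coordinates for N <= 25 graph nodes (e.g., MNIST superpixel centers).
coord = torch.rand(25, 2)

dist = torch.cdist(coord, coord)  # plain Euclidean distances, as the models were trained
dist_sq = dist ** 2               # squaring rescales any distance-derived edge weights,
                                  # so evaluating pretrained models on dist_sq may
                                  # shift results, per the README's caveat
```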