graph_attention_pool by bknyaz

PyTorch code for graph attention pooling (NeurIPS 2019 paper)

created 6 years ago
285 stars

Top 92.8% on sourcepulse

Project Summary

This repository provides code and data for reproducing experiments from the NeurIPS 2019 paper "Understanding Attention and Generalization in Graph Neural Networks". It targets researchers and practitioners interested in how attention affects generalization in GNNs, offering implementations for image classification (MNIST) and synthetic graph tasks (COLORS, TRIANGLES).

How It Works

The project implements ChebyGIN, a GNN architecture that combines Chebyshev spectral graph convolutions with GIN-style aggregation and uses attention for node pooling. The attention weights let the model learn which nodes are most relevant to a given task, improving generalization. Attention is explored in unsupervised, supervised, and weakly-supervised settings, with the weakly-supervised method performing competitively on the benchmark datasets.
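
To make the pooling mechanism concrete, here is a minimal PyTorch sketch of attention-based top-k node pooling. The AttentionPool module, its scoring network, and the ratio parameter are illustrative assumptions, not the repository's actual ChebyGIN code:

```python
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """Score nodes, keep the top-k, and gate their features by attention.
    Hypothetical sketch; the repository's pooling differs in detail."""
    def __init__(self, in_dim, ratio=0.5):
        super().__init__()
        self.score = nn.Linear(in_dim, 1)   # per-node attention logits
        self.ratio = ratio                  # fraction of nodes to keep

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) adjacency matrix
        alpha = torch.softmax(self.score(x).squeeze(-1), dim=0)  # attention over nodes
        k = max(1, int(self.ratio * x.size(0)))
        idx = torch.topk(alpha, k).indices                       # nodes with highest attention
        x = x[idx] * alpha[idx].unsqueeze(-1)                    # gate kept features by attention
        adj = adj[idx][:, idx]                                   # induced subgraph on kept nodes
        return x, adj, alpha
```

Gating the retained features by their attention weights keeps the pooling step differentiable with respect to the attention scores.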

Quick Start & Requirements

  • Install: pip install torch torchvision numpy scipy scikit-image scikit-learn networkx (PyTorch >= 0.4.1, Python >= 3.6.1). Ax is additionally required for hyperparameter tuning on certain datasets.
  • Data Generation: run ./scripts/prepare_data.sh (approx. 1 hour, 2 GB of data).
  • Pretrained Models: download checkpoints from the provided URLs (a generic loading sketch follows this list).
  • Examples: notebooks for evaluating and training models are available (MNIST_eval_models, TRIANGLES_eval_models).
  • Official Docs: Paper, Slides
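
The checkpoint format is not documented here; a generic PyTorch loading sketch (the file name and dictionary keys below are assumptions) might look like:

```python
import torch

# File name and keys are hypothetical; inspect the downloaded checkpoint first.
ckpt = torch.load('checkpoints/mnist_75sp.pth', map_location='cpu')
print(ckpt.keys() if isinstance(ckpt, dict) else type(ckpt))
# If the checkpoint is a dict holding a state dict, it can be restored with:
# model.load_state_dict(ckpt['state_dict'])
```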

Highlighted Details

  • Achieves state-of-the-art results using supervised attention on COLORS-Test-LargeC (75 ± 17%) and MNIST-75sp-Test-Noisy (92.3 ± 0.4%); a sketch of attention supervision follows this list.
  • Demonstrates a weakly-supervised attention approach that performs competitively without requiring ground-truth attention during training.
  • Includes data-generation scripts for the synthetic tasks (COLORS, TRIANGLES) and a preprocessed MNIST dataset (MNIST-75sp).
  • Provides visualization notebooks for understanding dataset structure and attention patterns.
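
A hypothetical sketch of how attention supervision can be added to the training objective; the attention_loss helper, the KL formulation, and the beta weight are illustrative assumptions, not the paper's exact loss:

```python
import torch.nn.functional as F

def attention_loss(pred_alpha, gt_alpha, eps=1e-8):
    # pred_alpha, gt_alpha: (batch, num_nodes) distributions over nodes (each row sums to 1).
    # KL(gt || pred) penalizes attending to nodes the ground truth deems irrelevant.
    pred_alpha = pred_alpha.clamp_min(eps)
    gt_alpha = gt_alpha.clamp_min(eps)
    return F.kl_div(pred_alpha.log(), gt_alpha, reduction='batchmean')

# total_loss = task_loss + beta * attention_loss(alpha, gt_alpha)  # beta balances the two terms
```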

Maintenance & Community

The repository is associated with Boris Knyazev, Graham W. Taylor, and Mohamed Amer. No specific community channels or active maintenance indicators are present in the README.

Licensing & Compatibility

The repository does not explicitly state a license. The code is presented for research purposes, and users should verify licensing for commercial or closed-source applications.

Limitations & Caveats

The README notes that the MNIST-75sp data were generated without squaring the dist variable in the code, so regenerating the dataset with squared distances would produce different results. The released models were also trained on smaller graphs than some test cases (e.g. N <= 25 for TRIANGLES).
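
As an illustration of the dist caveat (variable names here are assumptions about the data-generation code, not the repository's exact source):

```python
import numpy as np

coord = np.random.rand(75, 2)            # superpixel centroids, as in MNIST-75sp
diff = coord[:, None, :] - coord[None, :, :]
dist = np.sqrt((diff ** 2).sum(-1))      # non-squared distance, matching the released data
# dist = (diff ** 2).sum(-1)             # squared distance; using this instead would
#                                        # change the graph edge weights and thus results
```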

Health Check

  • Last commit: 4 years ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 1 star in the last 90 days
