sheldonresearch/ProG: Unified library for graph neural network prompting
Top 57.0% on SourcePulse
ProG is a unified Python library on PyTorch for Graph Prompting, simplifying single/multi-task prompting for pre-trained Graph Neural Networks (GNNs). It enables diverse graph workflows, including supervised learning, pre-training, and fine-tuning, serving as a benchmark for researchers and practitioners. ProG supports numerous GNN models, pre-training strategies, and datasets, streamlining complex graph analysis tasks.
How It Works
ProG integrates over five prompt models (e.g., All-in-One, GPPT, GPF Plus) and six pre-training strategies (e.g., DGI, GraphMAE, GraphCL) with various GNN backbones such as GCN, GraphSAGE, GAT, GT, and PyG-compatible models. It handles node and graph-level tasks across more than 15 diverse datasets, encompassing both homophilic and heterophilic graph structures. The library provides pre-trained models and tools for custom GNN pre-training.
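To make the idea of graph prompting concrete, here is a minimal, self-contained sketch of the general GPF-style approach (a learnable prompt vector added to every node's input features while the pre-trained GNN stays frozen). This is an illustration of the technique, not ProG's actual API: the `TinyGCNLayer` and `GPFPrompt` classes and all dimensions below are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

class TinyGCNLayer(nn.Module):
    # Simplified graph convolution: relu(A @ X @ W); stand-in for a
    # pre-trained backbone such as GCN or GraphSAGE.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        return torch.relu(self.lin(adj @ x))

class GPFPrompt(nn.Module):
    # GPF-style prompt: one shared learnable vector added to each
    # node's feature vector before the frozen backbone sees it.
    def __init__(self, feat_dim):
        super().__init__()
        self.p = nn.Parameter(torch.zeros(feat_dim))

    def forward(self, x):
        return x + self.p

torch.manual_seed(0)
num_nodes, feat_dim, hid = 5, 8, 4
x = torch.randn(num_nodes, feat_dim)
adj = torch.eye(num_nodes)  # toy adjacency (self-loops only)

gnn = TinyGCNLayer(feat_dim, hid)
for param in gnn.parameters():
    param.requires_grad_(False)  # backbone stays frozen

prompt = GPFPrompt(feat_dim)
head = nn.Linear(hid, 2)  # lightweight downstream task head

# Only the prompt and the head are trained.
opt = torch.optim.Adam(list(prompt.parameters()) + list(head.parameters()), lr=0.01)
labels = torch.tensor([0, 1, 0, 1, 0])
for _ in range(20):
    opt.zero_grad()
    out = head(gnn(prompt(x), adj))
    loss = nn.functional.cross_entropy(out, labels)
    loss.backward()
    opt.step()
```

The design point this illustrates is why prompting is cheap: the optimizer touches only the prompt vector and the task head, so the expensive pre-trained weights are reused unchanged across tasks.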
Quick Start & Requirements
Installation is straightforward via pip install prompt-graph. Key requirements include Python version 3.9 or higher, PyTorch, and a CUDA-enabled GPU. The setup process involves creating a Conda environment and installing PyTorch and DGL with appropriate CUDA support. Users must download an Experiment.zip archive (126MB) containing datasets, pre-trained models, and induced graphs. Example commands are provided for running downstream tasks (downstream_task.py) and benchmarking (bench.py).
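The setup described above might look like the following shell session. The CUDA wheel index and DGL package name are assumptions based on the standard PyTorch and DGL install flows; check the project README for the exact versions it pins.

```shell
# Create an isolated environment (Python >= 3.9 is required)
conda create -n prog python=3.9 -y
conda activate prog

# Install PyTorch with CUDA support (adjust the CUDA version to your driver)
pip install torch --index-url https://download.pytorch.org/whl/cu118
pip install dgl

# Install ProG itself
pip install prompt-graph

# After downloading and unpacking Experiment.zip alongside the scripts:
python downstream_task.py   # run a downstream task
python bench.py             # run the benchmark
```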
Highlighted Details
ProG received the KDD 2023 Best Research Paper Award for its "All in One" approach. It functions as a comprehensive benchmark, supporting a wide array of graph prompt models and pre-training strategies, and has been tested on over 15 datasets. The library integrates seamlessly with PyTorch Geometric (PyG). Recent updates include theoretical analyses and benchmark papers accepted at NeurIPS 2024.
Maintenance & Community
The project is led by Dr. Xiangguo SUN, with significant contributions from Prof. Jia LI and Prof. Hong CHENG. Development is active, with the main branch updated in June 2024 and a stable branch currently in progress. Key contributors are listed, but no community forum links (like Discord or Slack) are provided.
Licensing & Compatibility
The library is released under the MIT license, which generally permits broad use, including commercial applications and linking with closed-source software.
Limitations & Caveats
Dataset compatibility issues may arise with pre-trained models due to potential dataset updates; self-pre-training is recommended for specific datasets like ENZYMES and BZR. The "stable" branch is approximately 20% complete. The project's TODO list indicates planned additions such as more pre-training methods, new prompt types, and enhancements to data handling utilities.