GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks

We provide the code (in PyTorch) and datasets for our paper "GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks", accepted by WWW 2023.

We further extend GraphPrompt to GraphPrompt+ by enhancing both the pre-training and prompting stages, as described in our follow-up paper "Generalized Graph Prompt: Toward a Unification of Pre-Training and Downstream Tasks on Graphs", accepted by IEEE TKDE. The code and datasets for GraphPrompt+ are publicly available at https://github.com/gmcmt/graph_prompt_extension.

Description

The repository is organised as follows:

  • data/: contains the datasets we use.
  • graphdownstream/: implements pre-training and downstream tasks at the graph level.
  • nodedownstream/: implements downstream tasks at the node level.
  • convertor/: generates the raw data.

Package Dependencies

  • CUDA 11.3
  • dgl 0.9.0 (cu113 build)
  • dgllife
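
With pip, an installation consistent with these versions might look like the following (a sketch assuming the official DGL wheel index; adjust the CUDA build to your machine):

  • pip install dgl-cu113==0.9.0 -f https://data.dgl.ai/wheels/repo.html
  • pip install dgllife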

Running Experiments

Graph Classification

The default dataset is ENZYMES. To train and evaluate on other datasets, change the corresponding parameters in pre_train.py and prompt_fewshot.py.

Pre-train:

  • python pre_train.py

Prompt tune and test:

  • python prompt_fewshot.py
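
For intuition, prompt tuning in GraphPrompt learns a single prompt vector that reweights node embeddings feature-wise before the readout, while the pre-trained GNN stays frozen. Below is a minimal PyTorch sketch of this idea as we read it from the paper; the class and function names are ours and do not appear in this repository.

import torch
import torch.nn as nn

class PromptedReadout(nn.Module):
    """Learnable prompt vector applied feature-wise before sum readout."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.prompt = nn.Parameter(torch.ones(hidden_dim))  # the only tuned parameters

    def forward(self, node_embeddings: torch.Tensor) -> torch.Tensor:
        # node_embeddings: (num_nodes, hidden_dim) from the frozen pre-trained GNN
        return (self.prompt * node_embeddings).sum(dim=0)   # graph embedding: (hidden_dim,)

def classify(graph_emb: torch.Tensor, prototypes: torch.Tensor) -> int:
    # Few-shot prediction: pick the class whose prototype (mean embedding of
    # its support graphs) is most similar to the prompted graph embedding.
    sims = torch.cosine_similarity(graph_emb.unsqueeze(0), prototypes, dim=1)
    return int(sims.argmax())

Initializing the prompt to ones makes tuning start from the plain sum readout used at pre-training time, so the prompt only has to learn a task-specific reweighting.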

Node Classification

The default dataset is ENZYMES. To train and evaluate on other datasets, change the corresponding parameters in prompt_fewshot.py.

Prompt tune and test:

  • python run.py
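
At the node level, GraphPrompt casts node classification as classifying the node's contextual subgraph with the same prompted readout. A simplified 1-hop version of that readout (our illustration, reusing the hypothetical prompt vector from the sketch above) could look like:

import torch

def node_subgraph_readout(node_idx: int, node_embeddings: torch.Tensor,
                          adj: torch.Tensor, prompt: torch.Tensor) -> torch.Tensor:
    # Gather the node and its 1-hop neighbours (adj is a dense 0/1 adjacency
    # matrix), then apply the same prompt-weighted sum readout as in the
    # graph-level task.
    neighbours = adj[node_idx].nonzero().flatten()
    sub = torch.cat([node_embeddings[node_idx].unsqueeze(0),
                     node_embeddings[neighbours]], dim=0)
    return (prompt * sub).sum(dim=0)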

Citation

@inproceedings{liu2023graphprompt,
  title={GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks},
  author={Liu, Zemin and Yu, Xingtong and Fang, Yuan and Zhang, Xinming},
  booktitle={Proceedings of the ACM Web Conference 2023},
  year={2023}
}
