
LDS-GNN

This is the accompanying Python package for the ICML 2019 paper Learning Discrete Structures for Graph Neural Networks.

It implements the LDS method and its variant KNN-LDS, and reproduces the experiments reported in the paper.


Requirements

The code is written in Python 3.6 and TensorFlow version 1 (tested on versions 1.12 and 1.16). It requires scikit-learn >= 0.21.2 and the following Python packages:

  • FAR-HO, available here (recommended branch: final_ICML2019)
  • GCN, available here
Datasets

UCI datasets are loaded automatically, while the graph-based datasets (Cora and Citeseer) are included in the GCN package, available here. Place the relevant files in the folder lds/data.

The FMA dataset (we used the small version) must be downloaded separately; please email the authors if interested.

Installation (optional)

python setup.py install

The scripts contained in lds.py should also work without installing the package.

Run

Navigate to the lds_gnn folder.

The main script is in the file lds.py. The options are:

-d: the evaluation dataset. Available datasets are iris, wine, breast_cancer, digits, 20newstrain,
            20news10, cora, citeseer, fma. Default: breast_cancer
-m: the method: lds or knnlds. Default: knnlds
-s: the random seed. Default: 1
-e: the percentage of missing edges (valid only for the cora and citeseer datasets). Default: 50

For experiments with incomplete graphs on Cora and Citeseer, run

python lds.py -m lds -e {an integer between 0 and 100} -d {cora or citeseer} -s {if you want to specify random seed}
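
For example, with illustrative values for the options above:

python lds.py -m lds -e 25 -d cora -s 1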

For experiments in semi-supervised learning (with no input graph), run

python lds.py -m knnlds -d {any available dataset} -s {if you want to specify random seed}
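
For example, with illustrative values for the options above:

python lds.py -m knnlds -d wine -s 1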

The code will run a small grid search to select some of the method's hyperparameters, such as the (outer) optimization learning rate and the number of truncation steps used to compute the hypergradient. It will output the test accuracy of the best model found according to the "early stopping accuracy". It will also create one log file per experiment in the folder lds/results; these logs can later be loaded (e.g. in a notebook) with the function lds.load_results() for inspection and visualization.
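
A minimal sketch of inspecting the saved logs, assuming lds.load_results() can be called with no arguments and returns the stored experiment records (the exact signature and return type are not documented here, so treat this as illustrative):

# Hypothetical usage sketch; module path and return type are assumptions.
from lds_gnn import lds

results = lds.load_results()   # reads the log files saved under lds/results
for record in results:         # one record per experiment
    print(record)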

Please note that the package does not include implementations of baseline methods.

Licence

Please take a look at LICENCE.txt

Cite

If you use this package, please cite

@InProceedings{franceschi2019learning,
  title     = {Learning Discrete Structures for Graph Neural Networks},
  author    = {Luca Franceschi and Mathias Niepert and Massimiliano Pontil and Xiao He},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  year      = {2019}
}
