
GraphFM

Official implementation of "GraphFM: A generalist graph transformer that learns transferable representations across diverse domains" (TMLR 2025).

Warning

This repository is under construction. For now, we provide code for training our largest GraphFM model on an 8-GPU cluster. We will update the repository with extensive documentation, support for fine-tuning, more datasets, and weights for pretrained models.

Setup

To set up a Python virtual environment with the required dependencies, run:

python3 -m venv graphfm_env
source graphfm_env/bin/activate
pip install --upgrade pip

Then install PyTorch 1.9.1 and PyG (PyTorch Geometric) with CUDA 11.1 wheels, along with the remaining dependencies:

pip install torch==1.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
pip install torch-scatter torch-sparse torch-cluster torch-spline-conv torch-geometric -f https://data.pyg.org/whl/torch-1.9.0+cu111.html
pip install absl-py==0.12.0 tensorboard==2.6.0 ogb
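
To verify that the installation succeeded (an optional sanity check, assuming the virtual environment above is still active), you can print the installed versions and confirm that CUDA is visible:

# Print the PyTorch version and whether a CUDA device is available
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# Print the installed PyG version
python -c "import torch_geometric; print(torch_geometric.__version__)"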

The code uses PyG (PyTorch Geometric). All datasets are available through this package.
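
As a minimal illustration of how PyG exposes datasets (the Planetoid/Cora example below is a standard PyG benchmark used purely for illustration, not necessarily one of the datasets used in this repository), a graph can be downloaded and inspected in a few lines:

python - <<'EOF'
from torch_geometric.datasets import Planetoid

# Download the Cora citation graph into ./data and load it as a PyG Data object
dataset = Planetoid(root='./data/Planetoid', name='Cora')
data = dataset[0]
print(data)                                            # node features, edge_index, labels
print(dataset.num_classes, dataset.num_node_features)  # basic dataset statistics
EOF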

Download Network Repo Datasets

cd network_repo_download
python regex.py
sh unzip.sh

Preprocess Data

python preprocess_datasets.py --cfg configs/pretrain_model.yaml
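
Both this step and pretraining read the same YAML config. To confirm that the config file parses and to inspect its contents before launching (assuming PyYAML is available in the environment; install it with pip install pyyaml if needed), you can run:

# Parse the config and pretty-print it; fails loudly if the YAML is malformed
python -c "import yaml; print(yaml.dump(yaml.safe_load(open('configs/pretrain_model.yaml')), default_flow_style=False))"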

Pretrain Model

python main.py --cfg configs/pretrain_model.yaml
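
The exact multi-GPU invocation depends on how main.py sets up distributed training, which is not yet documented here. As a rough sketch only, assuming the script follows PyTorch 1.9's standard torch.distributed conventions, an 8-GPU run on a single node might be launched as:

# Hypothetical launch command: spawns 8 worker processes on one node.
# Check the repository configs/documentation for the intended invocation.
python -m torch.distributed.launch --nproc_per_node=8 main.py --cfg configs/pretrain_model.yaml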

Citation

If you find the code useful for your research, please consider citing our work:

@article{lachi2025graphfm,
  title={GraphFM: A generalist graph transformer that learns transferable representations across diverse domains},
  author={Lachi, Divyansha and Azabou, Mehdi and Arora, Vinam and Dyer, Eva L},
  journal={Transactions on Machine Learning Research},
  year={2025},
  url={https://openreview.net/forum?id=sZTpRfRUtR},
}
