The official implementation for the ICML 2025 paper "Supercharging Graph Transformers with Advective Diffusion" (Paper).
AdvDIFFormer is a graph Transformer derived from the closed-form solution of the advective diffusion equation, and is provably resilient to distribution shifts in graph topology. The model comes in two versions: AdvDIFFormer-i and AdvDIFFormer-s (the latter with linear complexity w.r.t. the number of nodes).
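As a rough illustration of where the linear complexity can come from, below is a minimal, hypothetical sketch of one propagation step that mixes a global kernelized-attention diffusion term, costing O(Nd^2) rather than O(N^2 d) in the number of nodes N, with a local advection term along graph edges. The class name, the `beta` mixing weight, and the exact update rule are assumptions for illustration, not the paper's formulation; see the model code in each task folder for the actual architectures.

```python
import torch
import torch.nn as nn

class AdvDiffLayerSketch(nn.Module):
    """Hypothetical propagation step: global linear-attention diffusion
    plus local advection along edges. Illustrative only; NOT the exact
    AdvDIFFormer update from the paper."""

    def __init__(self, dim: int, beta: float = 0.5):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.beta = beta  # assumed mixing weight between the two terms

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: [N, d] node features; adj: [N, N] normalized adjacency (dense or sparse)
        q = torch.relu(self.q(x))  # non-negative feature maps so the
        k = torch.relu(self.k(x))  # kernelized attention stays well-defined
        v = self.v(x)
        # Linear attention: contract k with v first, so cost is O(N d^2)
        kv = k.t() @ v                                                 # [d, d]
        numer = q @ kv                                                 # [N, d]
        denom = (q @ k.sum(dim=0, keepdim=True).t()).clamp(min=1e-6)   # [N, 1]
        diffusion = numer / denom          # attention-weighted average of values
        advection = adj @ v                # local propagation along graph edges
        return x + self.beta * diffusion + (1.0 - self.beta) * advection
```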

The model is applied to information networks, dynamic protein interactions and molecular mapping operator generation.

- Information Networks: `node-classification`
- Dynamic Protein Interactions: `dppin`
- Molecular Mapping Generation: `ham`
Please refer to the bash script `run.sh` in each folder for running the training and evaluation pipeline.
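For example, `cd node-classification && bash run.sh` would launch the node classification experiments; check each script for the exact datasets and hyperparameters it uses.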
AdvDIFFormer builds on our earlier works on scalable graph Transformers:
- NodeFormer: a scalable Transformer with linear complexity
- DIFFormer: a principled Transformer derived from diffusion equations with an energy constraint
- SGFormer: a simplified Transformer with single-layer efficient attention and approximation-free linear complexity
If you find our code and model useful, please cite our work. Thank you!
```bibtex
@inproceedings{wu2025advdifformer,
  title={Supercharging Graph Transformers with Advective Diffusion},
  author={Qitian Wu and Chenxiao Yang and Kaipeng Zeng and Michael Bronstein},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2025}
}
```