RGDAN: A random graph diffusion attention network for traffic prediction

Jin Fan, Wenchao Weng, Hao Tian, Huifeng Wu*, Fu Zhu, Jia Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)


Traffic prediction based on graph structures is a challenging task: road networks are typically complex, and the data to be analyzed contains variable temporal features. Moreover, the quality of spatial feature extraction depends heavily on the weight settings of the graph structure. In the transportation field, these graph weights are currently computed from factors such as the distance between roads. However, such methods take into account neither the characteristics of the roads themselves nor the correlations between different traffic flows. Existing approaches also tend to focus on extracting local spatial dependencies while ignoring global ones. A further problem is how to extract sufficient information at the limited depth of graph structures. To address these challenges, we propose a Random Graph Diffusion Attention Network (RGDAN) for traffic prediction. RGDAN comprises a graph diffusion attention module and a temporal attention module. The graph diffusion attention module adjusts its weights by learning from data, much as a CNN does, to capture more realistic spatial dependencies, while the temporal attention module captures temporal correlations. Experiments on three large-scale public datasets demonstrate that RGDAN produces predictions 2%–5% more precise than state-of-the-art methods.
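The core idea of the graph diffusion attention module, learning edge weights from node features rather than fixing them from road distances, can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the function name `graph_diffusion_attention`, the projection matrices `Wq`/`Wk`, and the toy sizes are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_diffusion_attention(x, Wq, Wk, steps=2):
    """Hypothetical sketch of learned graph diffusion attention.

    x: (N, d) node features for N road-network nodes.
    Wq, Wk: (d, d) learned projections (random here for illustration).
    Attention scores play the role of a learned adjacency matrix,
    replacing fixed distance-based edge weights; `steps` rounds of
    propagation mimic multi-step graph diffusion.
    """
    q, k = x @ Wq, x @ Wk
    attn = softmax(q @ k.T / np.sqrt(x.shape[1]))  # (N, N) learned weights
    out = x
    for _ in range(steps):
        out = attn @ out  # diffuse features along learned edges
    return out

N, d = 4, 8  # toy graph: 4 nodes, 8-dimensional features
x = rng.standard_normal((N, d))
Wq = rng.standard_normal((d, d)) * 0.1
Wk = rng.standard_normal((d, d)) * 0.1
y = graph_diffusion_attention(x, Wq, Wk)
print(y.shape)  # (4, 8): diffused node features, same shape as input
```

Because the attention matrix is a function of learned projections, gradient descent can reshape the effective graph during training, which is the property the abstract contrasts with fixed distance-based weights.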

Original language: English
Article number: 106093
Pages (from-to): 1-13
Number of pages: 13
Journal: Neural Networks
Publication status: Published - Apr 2024


  • Traffic prediction
  • Attention networks
  • Spatial–temporal model
  • Graph convolutional network
  • Deep learning
  • Spatial–temporal embedding


