STG2seq: spatial-temporal graph to sequence model for multi-step passenger demand forecasting

Lei Bai*, Lina Yao, Salil S. Kanhere, Xianzhi Wang, Quan Z. Sheng

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

156 Citations (Scopus)

Abstract

Multi-step passenger demand forecasting is a crucial task in on-demand vehicle sharing services. However, predicting passenger demand over multiple time horizons is generally challenging due to the nonlinear and dynamic spatial-temporal dependencies. In this work, we propose to model multi-step citywide passenger demand prediction based on a graph and use a hierarchical graph convolutional structure to capture both spatial and temporal correlations simultaneously. Our model consists of three parts: 1) a long-term encoder to encode historical passenger demands; 2) a short-term encoder to derive the next-step prediction for generating multi-step predictions; 3) an attention-based output module to model dynamic temporal and channel-wise information. Experiments on three real-world datasets show that our model consistently outperforms many baseline methods and state-of-the-art models.
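
To make the three-part architecture in the abstract concrete, the sketch below shows one way such a model could be wired up in PyTorch: graph convolution over the city graph with historical steps stacked into the node features (so one operation sees both spatial and temporal correlations), a long-term and a short-term encoder sharing that block, and an attention-weighted output head. This is a rough illustration under stated assumptions, not the authors' released STG2Seq implementation; the module names (GraphConv, STEncoder, AttentionOutput), layer sizes, and the stand-in adjacency matrix are all hypothetical.

```python
# Minimal illustrative sketch (assumptions, not the authors' code):
# spatial-temporal graph convolution + attention-based output head.
import torch
import torch.nn as nn


class GraphConv(nn.Module):
    """One graph convolution step: aggregate neighbours with a normalized
    adjacency matrix, then apply a learned linear map."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (batch, num_nodes, in_dim); adj: (num_nodes, num_nodes)
        return torch.relu(self.linear(adj @ x))


class STEncoder(nn.Module):
    """Encode a window of historical demand by folding the time axis into
    the node features and stacking graph convolutions, so spatial and
    temporal correlations are handled by the same graph operation."""
    def __init__(self, window, hidden, layers=2):
        super().__init__()
        dims = [window] + [hidden] * layers
        self.convs = nn.ModuleList(
            GraphConv(d_in, d_out) for d_in, d_out in zip(dims[:-1], dims[1:])
        )

    def forward(self, demand, adj):
        # demand: (batch, window, num_nodes) -> (batch, num_nodes, window)
        h = demand.transpose(1, 2)
        for conv in self.convs:
            h = conv(h, adj)
        return h  # (batch, num_nodes, hidden)


class AttentionOutput(nn.Module):
    """Attention-based output: weight the long- and short-term encodings
    before projecting to the next-step demand of every node."""
    def __init__(self, hidden):
        super().__init__()
        self.score = nn.Linear(hidden, 1)
        self.out = nn.Linear(hidden, 1)

    def forward(self, long_h, short_h):
        stacked = torch.stack([long_h, short_h], dim=2)      # (B, N, 2, H)
        weights = torch.softmax(self.score(stacked), dim=2)  # (B, N, 2, 1)
        fused = (weights * stacked).sum(dim=2)                # (B, N, H)
        return self.out(fused).squeeze(-1)                    # (B, N)


# Toy shapes only: 3 historical steps, 16 city regions as graph nodes.
if __name__ == "__main__":
    batch, window, nodes, hidden = 4, 3, 16, 32
    adj = torch.softmax(torch.rand(nodes, nodes), dim=-1)     # stand-in adjacency
    history = torch.rand(batch, window, nodes)
    long_enc, short_enc = STEncoder(window, hidden), STEncoder(1, hidden)
    head = AttentionOutput(hidden)
    next_step = head(long_enc(history, adj), short_enc(history[:, -1:], adj))
    print(next_step.shape)  # torch.Size([4, 16])
```

In the paper's formulation the hierarchical graph convolutional structure replaces a recurrent unit for the temporal dimension; the toy `__main__` block above only checks tensor shapes for a single predicted step, whereas multi-step forecasting would feed each prediction back through the short-term encoder.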

Original language: English
Title of host publication: Proceedings of the 28th International Joint Conference on Artificial Intelligence
Editors: Sarit Kraus
Place of Publication: California
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 1981-1987
Number of pages: 7
ISBN (Electronic): 9780999241141
DOIs
Publication status: Published - 2019
Event: 28th International Joint Conference on Artificial Intelligence, IJCAI 2019 - Macao, China
Duration: 10 Aug 2019 - 16 Aug 2019

Conference

Conference: 28th International Joint Conference on Artificial Intelligence, IJCAI 2019
Country/Territory: China
City: Macao
Period: 10/08/19 - 16/08/19
