Chinese Academy of Sciences Institutional Repositories Grid
Spatio-Temporal Transformer with Clustering and Dilated Attention for Traffic Prediction

Document Type: Conference Paper

Authors: Xu BW (许宝文)1,2; Wang XL (王学雷)1; Liu CB (刘承宝)1; Li S (李铄)1,2; Li JW (李经纬)1,2
Publication Date: 2023-09
Conference Date: 2023-04
Conference Venue: Bilbao, Bizkaia, Spain
Abstract

Traffic prediction is a crucial task in intelligent transportation systems, as it supports effective management and alleviation of traffic congestion. However, due to the complexity and uncertainty of traffic systems, accurate traffic prediction has long been a challenging problem. The specific challenge of this task is to model traffic dynamics along the temporal and spatial dimensions in a principled manner while respecting and exploiting the spatial and temporal heterogeneity of traffic data. To address these challenges, this paper proposes a new Transformer-based approach to traffic prediction. Specifically, to accurately model complex spatial correlations, we design a spatial Transformer layer combined with clustering, which reduces computational complexity and mitigates the risk of overfitting. To model dynamic nonlinear temporal correlations, we introduce dilated attention, which provides a global receptive field conducive to long-term prediction. To validate the effectiveness of the proposed model, we conduct experiments on four real-world traffic datasets. The experimental results demonstrate that our model outperforms state-of-the-art baselines. Furthermore, comparative experiments show that both the spatial clustering and dilated attention modules contribute to the overall improvement in the model's performance.
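
This record carries no implementation details, so the sketch below is an illustration only, not the authors' code: one plausible reading of the two mechanisms named in the abstract, namely within-cluster spatial self-attention (attention computed only among nodes grouped by k-means, cutting the quadratic cost over sensors) and dilated temporal attention (each query attends to a strided subset of time steps, cheaply extending the receptive field). All function names, shapes, and hyperparameters are assumptions.

```python
# Illustrative sketch only -- NOT the paper's implementation. Assumes
# PyTorch tensors; k-means comes from scikit-learn. Names, shapes, and
# hyperparameters (n_clusters, dilation) are hypothetical.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def clustered_spatial_attention(x, n_clusters=8):
    """Self-attention restricted to within-cluster node pairs.

    x: (num_nodes, dim) node features at one time step. Grouping nodes
    with k-means and attending only inside each cluster avoids the full
    O(N^2) pairwise cost of dense spatial attention.
    """
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(
        x.detach().cpu().numpy()
    )
    out = torch.empty_like(x)
    for c in range(n_clusters):
        idx = torch.as_tensor((labels == c).nonzero()[0], device=x.device)
        if idx.numel() == 0:
            continue  # k-means can leave a cluster empty
        xc = x[idx]  # (cluster_size, dim)
        scores = xc @ xc.transpose(0, 1) / x.size(-1) ** 0.5
        out[idx] = F.softmax(scores, dim=-1) @ xc
    return out


def dilated_temporal_attention(q, k, v, dilation=2):
    """Attention over a strided (dilated) subset of time steps.

    q, k, v: (batch, seq_len, dim). Each query attends only to keys at
    indices 0, dilation, 2*dilation, ..., so stacking layers with
    increasing dilation rates yields a long-range receptive field at
    reduced cost, which suits long-term forecasting.
    """
    idx = torch.arange(0, k.size(1), dilation, device=k.device)
    k_d, v_d = k[:, idx], v[:, idx]  # subsampled keys and values
    scores = q @ k_d.transpose(-2, -1) / q.size(-1) ** 0.5
    return F.softmax(scores, dim=-1) @ v_d
```

Under these assumptions, the spatial module trades exact all-pairs attention for within-cluster attention, while the temporal module trades dense attention for a strided one; both are common sparsification patterns for Transformers, and the paper's exact formulations may differ.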

Source URL: [http://ir.ia.ac.cn/handle/173211/57605]
Collection: Research Center of Integrated Information Systems / Industrial Intelligence Technology and Systems
Author Affiliations:
1. Institute of Automation, Chinese Academy of Sciences
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Xu BW, Wang XL, Liu CB, et al. Spatio-Temporal Transformer with Clustering and Dilated Attention for Traffic Prediction[C]. In: . Bilbao, Bizkaia, Spain. 2023-04.

Deposit Method: OAI Harvesting

Source: Institute of Automation

