STAN: Spatio-Temporal Attention Network for Next Point-of-Interest Recommendation
Document Type | Conference Paper
Authors | Luo, Yingtao [1]; Liu, Qiang [2,4]; Liu, Zhaocheng [3]
Publication Date | 2021-04
Conference Dates | 2021-04-19 to 2021-04-23
Conference Location | Ljubljana, Slovenia
Abstract | Next-location recommendation is at the core of various location-based applications. Current state-of-the-art models have attempted to solve spatial sparsity with hierarchical gridding and to model temporal relations with explicit time intervals, yet some vital questions remain unsolved. Non-adjacent locations and non-consecutive visits provide non-trivial correlations for understanding a user's behavior, but they have rarely been considered. To aggregate all relevant visits from a user's trajectory and recall the most plausible candidates from weighted representations, we propose a Spatio-Temporal Attention Network (STAN) for location recommendation. STAN explicitly exploits the relative spatio-temporal information of all check-ins along the trajectory with self-attention layers. This design allows point-to-point interaction between non-adjacent locations and non-consecutive check-ins with an explicit spatio-temporal effect. STAN uses a bi-layer attention architecture that first aggregates spatio-temporal correlation within the user trajectory and then recalls the target while accounting for personalized item frequency (PIF). Through visualization, we show that STAN is in line with the above intuition. Experimental results unequivocally show that our model outperforms existing state-of-the-art methods by 9-17%.
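The two mechanisms the abstract names can be illustrated with a minimal sketch. The code below is not the authors' implementation: the bin-based interval biases, all tensor shapes, and the `recall_scores` helper are illustrative assumptions (the paper's exact handling of continuous intervals may differ). It shows (1) self-attention whose logits are biased by explicit relative time and distance intervals between all pairs of check-ins, and (2) a recall step that sums matching scores over every trajectory position, so a POI visited k times contributes k terms, one simple way to respect personalized item frequency.

```python
# Minimal sketch of spatio-temporal self-attention and PIF-aware recall.
# Assumptions, not details from the paper: scalar per-bin interval biases,
# hour/km units, and all sizes in the toy usage at the bottom.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatioTemporalSelfAttention(nn.Module):
    def __init__(self, d_model, num_time_bins=100, num_dist_bins=100):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned scalar bias per discretized interval bin (a
        # simplification of the paper's continuous-interval embedding).
        self.time_bias = nn.Embedding(num_time_bins, 1)
        self.dist_bias = nn.Embedding(num_dist_bins, 1)
        self.num_time_bins = num_time_bins
        self.num_dist_bins = num_dist_bins

    def forward(self, x, dt, dd):
        # x:  (B, L, d) check-in embeddings (e.g., POI + time slot + user)
        # dt: (B, L, L) pairwise time gaps; dd: (B, L, L) pairwise distances
        d = x.size(-1)
        logits = self.q(x) @ self.k(x).transpose(-2, -1) / d ** 0.5
        t_bin = dt.clamp(0, self.num_time_bins - 1).long()
        s_bin = dd.clamp(0, self.num_dist_bins - 1).long()
        # Explicit spatio-temporal effect: every pair of check-ins, adjacent
        # or not, gets a bias from its relative time and distance interval.
        logits = logits + self.time_bias(t_bin).squeeze(-1) \
                        + self.dist_bias(s_bin).squeeze(-1)
        attn = F.softmax(logits, dim=-1)
        return attn @ self.v(x)  # (B, L, d) updated check-in representations

def recall_scores(traj, cand):
    # traj: (B, L, d) aggregated trajectory; cand: (N, d) candidate POIs.
    # Summing over ALL positions (rather than keeping only the last state)
    # lets a repeatedly visited POI accumulate weight from each visit.
    return torch.einsum('bld,nd->bn', traj, cand)

# Toy usage with hypothetical sizes:
B, L, d, N = 2, 10, 64, 500
layer = SpatioTemporalSelfAttention(d)
x = torch.randn(B, L, d)
dt = torch.rand(B, L, L) * 48   # hours between check-ins
dd = torch.rand(B, L, L) * 20   # km between check-ins
scores = recall_scores(layer(x, dt, dd), torch.randn(N, d))
print(scores.shape)             # torch.Size([2, 500])
```

A second attention layer over the candidate set would play the role of the recall step in the paper's bi-layer architecture; the dot-product matcher above stands in for it only to keep the sketch short.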
Source URL | http://ir.ia.ac.cn/handle/173211/47489
Collection | Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding Author | Liu, Qiang
Affiliations | 1. University of Washington; 2. University of Chinese Academy of Sciences; 3. Renmin University of China; 4. Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | Luo, Yingtao, Liu, Qiang, Liu, Zhaocheng. STAN: Spatio-Temporal Attention Network for Next Point-of-Interest Recommendation[C]//Proceedings of the Web Conference 2021 (WWW '21). Ljubljana, Slovenia, 2021.
Deposited via: OAI harvesting
Source: Institute of Automation