Chinese Academy of Sciences Institutional Repositories Grid
Augmentation, Retrieval, Generation: Event Sequence Prediction with a Three-Stage Sequence-to-Sequence Approach

Document Type: Conference Paper

Authors: Bo Zhou 1,2; Chenhao Wang 1,2; Yubo Chen 1,2; Kang Liu 1,2,3; Jun Zhao 1,2; Jiexin Xu 4; Xiaojian Jiang 4; Qiuxia Li 4
Publication Date: 2022
Conference Date: 2022-10
Conference Venue: Gyeongju, Republic of Korea
Abstract (English)

Being able to infer possible events related to a specific target is critical to natural language processing. One challenging task in this line is event sequence prediction, which aims at predicting a sequence of events given a goal. The existing approach models this task as a statistical induction problem, predicting a sequence of events by exploiting the similarity between the given goal and known sequences of events. However, this statistics-based approach is complex and predicts only a limited variety of events, and it ignores rich external event knowledge that is important for predicting event sequences. In this paper, in order to predict more diverse events, we first reformulate event sequence prediction as a sequence generation problem. Then, to leverage external event knowledge, we propose a three-stage model comprising augmentation, retrieval, and generation. Experimental results on the event sequence prediction dataset show that our model outperforms existing methods, demonstrating its effectiveness.
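The record contains no implementation details, so the following is a minimal, illustrative Python sketch of how the three stages named in the abstract (augmentation, retrieval, generation) could compose into one pipeline. Every function here is an assumption, not the authors' method: a toy query expander, a token-overlap retriever over a stand-in event knowledge base, and a trivial stub in place of the trained sequence-to-sequence generator.

```python
from typing import List

# Stage 1 -- augmentation: expand the input goal into several query
# variants so retrieval has more surface forms to match. Purely
# illustrative; the paper's actual augmentation strategy may differ.
def augment(goal: str) -> List[str]:
    return [goal, f"how to {goal}", f"steps for {goal}"]

# Stage 2 -- retrieval: rank entries of an external event knowledge
# base against the augmented queries and keep the top k. Token overlap
# is a stand-in for whatever retriever the model actually uses.
def retrieve(queries: List[str], event_kb: List[str], k: int = 1) -> List[str]:
    def overlap(query: str, doc: str) -> int:
        q_tokens = set(query.lower().split())
        d_tokens = set(doc.lower().replace(",", " ").split())
        return len(q_tokens & d_tokens)
    return sorted(event_kb,
                  key=lambda doc: max(overlap(q, doc) for q in queries),
                  reverse=True)[:k]

# Stage 3 -- generation: the paper conditions a sequence-to-sequence
# model on the goal plus the retrieved knowledge; splitting the
# retrieved evidence into steps is a trivial stand-in for decoding.
def generate(goal: str, evidence: List[str]) -> List[str]:
    return [step.strip() for entry in evidence for step in entry.split(",")]

if __name__ == "__main__":
    event_kb = [
        "boil water, add noodles, drain, serve",  # toy knowledge entries
        "buy ticket, board train, find seat",
    ]
    goal = "cook noodles"
    events = generate(goal, retrieve(augment(goal), event_kb))
    print(events)  # ['boil water', 'add noodles', 'drain', 'serve']
```

The point of the sketch is the staged interface: augmentation widens the query, retrieval injects external event knowledge, and generation conditions on both, which is what lets the approach produce events beyond those seen in known sequences.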

Proceedings Publisher: ACL
Source URL: http://ir.ia.ac.cn/handle/173211/52312
Collection: National Laboratory of Pattern Recognition_Natural Language Processing
Corresponding Author: Bo Zhou
Affiliations:
1. National Laboratory of Pattern Recognition, CASIA
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
3. Beijing Academy of Artificial Intelligence
4. China Merchants Bank
Recommended Citation
GB/T 7714
Bo Zhou, Chenhao Wang, Yubo Chen, et al. Augmentation, Retrieval, Generation: Event Sequence Prediction with a Three-Stage Sequence-to-Sequence Approach[C]. Gyeongju, Republic of Korea, 2022-10.

Deposit Method: OAI harvesting

Source: Institute of Automation

