Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference
Document Type: Conference Paper
Authors | He Bai; Yu Zhou; Jiajun Zhang; et al.
Publication Date | 2019
Conference Date | 2019
Conference Venue | Florence, Italy
Abstract (English) | Dialogue contexts are proven helpful in the spoken language understanding (SLU) system and they are typically encoded with explicit memory representations. However, most of the previous models learn the context memory with only one objective, maximizing the SLU performance, leaving the context memory under-exploited. In this paper, we propose a new dialogue logistic inference (DLI) task to consolidate the context memory jointly with SLU in the multi-task framework. DLI is defined as sorting a shuffled dialogue session into its original logical order and shares the same memory encoder and retrieval mechanism as the SLU model. Our experimental results show that various popular contextual SLU models can benefit from our approach, and the improvements are quite impressive, especially in slot filling.
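A minimal sketch, not taken from the record or the paper, of how the DLI auxiliary task described in the abstract could be arranged next to SLU: training pairs are built by treating the true next turn of a session as a positive and turns drawn from elsewhere in the shuffled session as negatives, and the two objectives are combined as a weighted sum. The function names, the negative-sampling scheme, and the `dli_weight` hyper-parameter are illustrative assumptions, not details confirmed by the paper.

```python
import random


def make_dli_examples(session, num_negatives=1, seed=0):
    """Build dialogue logistic inference (DLI) training pairs from one session.

    For each position t, the true next turn is a positive example (label 1);
    turns sampled from the rest of the shuffled session serve as negatives
    (label 0). This is only one illustrative reading of the abstract's
    "sorting a shuffled dialogue session into its original logical order";
    the paper's exact construction may differ.
    """
    rng = random.Random(seed)
    examples = []
    for t in range(1, len(session)):
        context, true_next = session[:t], session[t]
        examples.append((context, true_next, 1))
        candidates = session[:t] + session[t + 1:]
        for neg in rng.sample(candidates, min(num_negatives, len(candidates))):
            examples.append((context, neg, 0))
    return examples


def joint_loss(slu_loss, dli_loss, dli_weight=0.5):
    """Multi-task objective: SLU loss plus a weighted DLI auxiliary loss.

    `dli_weight` is a hypothetical hyper-parameter; the paper may balance
    the two objectives differently.
    """
    return slu_loss + dli_weight * dli_loss


if __name__ == "__main__":
    session = [
        "i need a flight to boston",
        "which airline do you prefer",
        "delta if possible",
        "what date are you leaving",
    ]
    for context, candidate, label in make_dli_examples(session):
        print(label, "|", candidate, "| context:", context)
```

Since the abstract states that DLI shares the same memory encoder and retrieval mechanism as the SLU model, both loss terms would back-propagate into the same context memory; the sketch above only shows how the data and the combined objective could be laid out.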
Language | English
Source URL | [http://ir.ia.ac.cn/handle/173211/26136]
Collection | National Laboratory of Pattern Recognition_Natural Language Processing
Affiliation | Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | He Bai, Yu Zhou, Jiajun Zhang, et al. Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference[C]. Florence, Italy, 2019.
Entry Method: OAI harvesting
Source: Institute of Automation