Chinese Academy of Sciences Institutional Repositories Grid
Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction

Document type: Conference paper

Authors: Xu, Yichen (3); Zhu, Yanqiao (1,2); Yu, Feng (4); Liu, Qiang (1,2); Wu, Shu (1,2,5)
Publication date: 2021-12
Conference date: 2021-12
Conference location: Online
Pages: 3263-3267
Abstract

Recently, Deep Neural Networks (DNNs) have made remarkable progress in text classification, which, however, still requires a large amount of labeled data. To train high-performing models at minimal annotation cost, active learning has been proposed to select and label only the most informative samples, yet it remains challenging to measure the informativeness of samples used in DNNs. In this paper, inspired by the piece-wise linear interpretability of DNNs, we propose a novel Active Learning with DivErse iNterpretations (ALDEN) approach. Using local interpretations in DNNs, ALDEN identifies linearly separable regions of samples. It then selects samples according to the diversity of their local interpretations and queries their labels. To tackle the text classification problem, we choose the word with the most diverse interpretations to represent the whole sentence. Extensive experiments demonstrate that ALDEN consistently outperforms several state-of-the-art deep active learning methods.
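The abstract describes the selection rule only at a high level. Below is a minimal, hypothetical PyTorch sketch of that rule, assuming gradients of the top predicted logit with respect to word embeddings serve as the "local interpretations" and cosine dissimilarity to the already-labeled pool as the diversity measure. The function names, tensor shapes, and the exact diversity metric are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn.functional as F

def local_interpretations(model, embeddings):
    # Assumption: gradients of the top logit w.r.t. word embeddings act as a
    # local (piece-wise linear) interpretation of the model's decision.
    embeddings = embeddings.clone().detach().requires_grad_(True)
    logits = model(embeddings)                       # (batch, num_classes)
    score = logits.max(dim=-1).values.sum()
    grads, = torch.autograd.grad(score, embeddings)
    return grads                                     # (batch, seq_len, emb_dim)

def sentence_diversity(unlabeled_interp, labeled_interp):
    # Per-word diversity = 1 - max cosine similarity to the labeled pool;
    # each sentence is represented by its most diverse word (max over words).
    u = F.normalize(unlabeled_interp, dim=-1)        # (B, L, D)
    p = F.normalize(labeled_interp, dim=-1)          # (P, D) pooled word interpretations
    sim = torch.einsum('bld,pd->blp', u, p)          # similarity to the labeled pool
    word_div = 1.0 - sim.max(dim=-1).values          # (B, L)
    return word_div.max(dim=-1).values               # (B,) per-sentence score

def select_queries(scores, k):
    # Query labels for the k unlabeled samples with the most diverse interpretations.
    return torch.topk(scores, min(k, scores.numel())).indices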

Proceedings publisher: ACM Press
Source URL: http://ir.ia.ac.cn/handle/173211/48468
Collection: Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding author: Wu, Shu
Author affiliations:
1. Center for Research on Intelligent Perception and Computing, Institute of Automation, Chinese Academy of Sciences
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
3. School of Computer Science, Beijing University of Posts and Telecommunications
4. Alibaba Group
5. Artificial Intelligence Research, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Xu, Yichen, Zhu, Yanqiao, Yu, Feng, et al. Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction[C]. Online, 2021-12: 3263-3267.

Ingestion method: OAI harvesting

Source: Institute of Automation
