Chinese Academy of Sciences Institutional Repositories Grid
A Novel Apex-Time Network for Cross-Dataset Micro-Expression Recognition

Document Type: Conference Paper

Authors: Peng, Min3; Wang, Chongyang2; Bi, Tao2; Shi, Yu3; Zhou, Xiangdong3; Chen, Tong1
Publication Date: 2019
Conference Date: September 3, 2019 - September 6, 2019
Conference Venue: Cambridge, United Kingdom
DOI: 10.1109/ACII.2019.8925525
Abstract: The automatic recognition of micro-expressions has been boosted ever since the successful introduction of deep learning approaches. As researchers working on this topic move toward learning from the nature of micro-expressions, the practice of using deep learning techniques has evolved from processing the entire video clip of a micro-expression to recognition on the apex frame alone. Using the apex frame removes redundant video frames, but the relevant temporal evidence of the micro-expression is thereby left out. This paper proposes a novel Apex-Time Network (ATNet) to recognize micro-expressions based on spatial information from the apex frame as well as temporal information from the respective adjacent frames. Through extensive experiments on three benchmarks, we demonstrate the improvement achieved by learning such temporal information. In particular, the model with such temporal information is more robust in cross-dataset validations. © 2019 IEEE.
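The abstract describes ATNet only at a high level: spatial features are learned from the apex frame and temporal features from the frames adjacent to it, and the two are combined for recognition. This record contains no implementation details, so the following is a minimal sketch of that two-branch idea, not the authors' architecture; the class names, layer sizes, the use of stacked frame differences as the temporal input, and the logit-averaging fusion are all illustrative assumptions.

```python
# Illustrative sketch only: the actual ATNet is described in the paper
# (DOI: 10.1109/ACII.2019.8925525). Layer sizes, inputs, and the fusion
# scheme below are assumptions made for demonstration.
import torch
import torch.nn as nn


class SpatialBranch(nn.Module):
    """Encodes appearance information from the single apex frame."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, apex_frame):  # apex_frame: (B, 3, H, W)
        x = self.features(apex_frame).flatten(1)
        return self.classifier(x)


class TemporalBranch(nn.Module):
    """Encodes motion cues from frames adjacent to the apex frame,
    represented here as stacked frame differences (an assumption)."""

    def __init__(self, num_adjacent: int = 2, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(num_adjacent, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, frame_diffs):  # frame_diffs: (B, num_adjacent, H, W)
        x = self.features(frame_diffs).flatten(1)
        return self.classifier(x)


class ApexTimeSketch(nn.Module):
    """Fuses spatial (apex) and temporal (adjacent-frame) predictions by
    averaging their logits; the real fusion in ATNet may differ."""

    def __init__(self, num_adjacent: int = 2, num_classes: int = 3):
        super().__init__()
        self.spatial = SpatialBranch(num_classes)
        self.temporal = TemporalBranch(num_adjacent, num_classes)

    def forward(self, apex_frame, frame_diffs):
        return 0.5 * (self.spatial(apex_frame) + self.temporal(frame_diffs))


if __name__ == "__main__":
    model = ApexTimeSketch()
    apex = torch.randn(4, 3, 112, 112)   # batch of apex frames
    diffs = torch.randn(4, 2, 112, 112)  # differences to two adjacent frames
    print(model(apex, diffs).shape)      # torch.Size([4, 3])
```

The sketch only conveys the split emphasized in the abstract: a spatial branch that can exploit the apex frame alone, plus a temporal branch that reinstates the motion evidence that apex-only recognition discards.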
Proceedings: 8th International Conference on Affective Computing and Intelligent Interaction, ACII 2019
Language: English
Source URL: [http://119.78.100.138/handle/2HOD01W0/9782]
Collection: Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences
Author Affiliations:
1. College of Electronic and Information Engineering, Southwest University, Chongqing, China;
2. UCL Interaction Centre, University College London, London, United Kingdom;
3. Intelligent Security Center, Chongqing Institute of Green and Intelligent Technology, Chongqing, China
Recommended Citation
GB/T 7714
Peng, Min, Wang, Chongyang, Bi, Tao, et al. A Novel Apex-Time Network for Cross-Dataset Micro-Expression Recognition[C]. In: 8th International Conference on Affective Computing and Intelligent Interaction, ACII 2019, Cambridge, United Kingdom, September 3, 2019 - September 6, 2019.

Ingestion Method: OAI Harvesting

Source: Chongqing Institute of Green and Intelligent Technology


Unless otherwise stated, all content in this system is protected by copyright and all rights are reserved.