Vectorized Evidential Learning for Weakly-Supervised Temporal Action Localization
Document type: Journal article
Authors | Gao, Junyu (2,3); Chen, Mengyuan (2,3); Xu, Changsheng (1,2,3) |
Journal | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE |
Publication date | 2023-12-01 |
Volume | 45 | Issue | 12 | Pages | 15949-15963 |
ISSN | 0162-8828 |
Keywords | Uncertainty; Location awareness; Reliability; Videos; Noise measurement; Estimation; Deep learning; Weakly-supervised learning; Temporal action localization; Evidential deep learning; Uncertainty estimation |
DOI | 10.1109/TPAMI.2023.3311447 |
Corresponding author | Xu, Changsheng (csxu@nlpr.ia.ac.cn) |
Abstract | With the explosive growth of videos, the weakly-supervised temporal action localization (WS-TAL) task has become a promising research direction in pattern analysis and machine learning. WS-TAL aims to detect and localize action instances with only video-level labels during training. Modern approaches have achieved impressive progress via powerful deep neural networks. However, robust and reliable WS-TAL remains challenging and underexplored due to the considerable uncertainty caused by weak supervision, noisy evaluation environments, and unknown categories in the open world. To this end, we propose a new paradigm, named vectorized evidential learning (VEL), to explore local-to-global evidence collection for facilitating model performance. Specifically, a series of learnable meta-action units (MAUs) are automatically constructed, which serve as fundamental elements constituting diverse action categories. Since the same meta-action unit can manifest as distinct action components within different action categories, we leverage MAUs and category representations to dynamically and adaptively learn action components and action-component relations. After performing uncertainty estimation at both the category level and the unit level, the local evidence from action components is accumulated and optimized under Subjective Logic theory. Extensive experiments on the regular, noisy, and open-set settings of three popular benchmarks show that VEL consistently obtains more robust and reliable action localization performance than state-of-the-art methods. |
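The abstract's evidence accumulation and uncertainty estimation rest on Subjective Logic, in which non-negative class evidence parameterizes a Dirichlet distribution and uncertainty (vacuity) shrinks as total evidence grows. The following is a minimal sketch of that standard evidence-to-opinion mapping only, not the paper's VEL implementation; the function name and inputs are illustrative assumptions.

```python
import numpy as np

def subjective_logic_opinion(evidence):
    """Map non-negative per-class evidence to Subjective Logic belief masses
    and uncertainty via a Dirichlet parameterization (alpha = evidence + 1).
    Illustrative sketch; not the paper's VEL model."""
    evidence = np.asarray(evidence, dtype=float)
    K = evidence.size            # number of classes
    alpha = evidence + 1.0       # Dirichlet concentration parameters
    S = alpha.sum()              # Dirichlet strength (total evidence + K)
    belief = evidence / S        # per-class belief mass
    uncertainty = K / S          # vacuity: high when evidence is scarce
    return belief, uncertainty   # belief.sum() + uncertainty == 1

# Strong evidence for class 0 yields low uncertainty ...
b, u = subjective_logic_opinion([9.0, 1.0, 0.0])
# ... while zero evidence yields maximal uncertainty (u == 1).
b0, u0 = subjective_logic_opinion([0.0, 0.0, 0.0])
```

Beliefs and uncertainty always sum to one, which is what lets local evidence from different sources be fused and compared on a common scale.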
WOS keywords | UNCERTAINTY |
Funding projects | National Key Research and Development Plan of China[2020AAA0106200] ; National Natural Science Foundation of China[62036012] ; National Natural Science Foundation of China[62236008] ; National Natural Science Foundation of China[U21B2044] ; National Natural Science Foundation of China[61721004] ; National Natural Science Foundation of China[62102415] ; National Natural Science Foundation of China[62072286] ; National Natural Science Foundation of China[62106262] ; National Natural Science Foundation of China[62002355] ; Beijing Natural Science Foundation[L201001] ; Open Research Projects of Zhejiang Lab[2022RC0AB02] |
WOS research areas | Computer Science ; Engineering |
Language | English |
Publisher | IEEE COMPUTER SOC |
WOS record number | WOS:001130146400117 |
Funding agencies | National Key Research and Development Plan of China ; National Natural Science Foundation of China ; Beijing Natural Science Foundation ; Open Research Projects of Zhejiang Lab |
Source URL | [http://ir.ia.ac.cn/handle/173211/55539] |
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems |
Author affiliations | 1. Peng Cheng Lab, Shenzhen 518055, Peoples R China 2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 101408, Peoples R China 3. Chinese Acad Sci, Inst Automat, State Key Lab Multimodal Artificial Intelligence, Beijing 100190, Peoples R China |
Recommended citation (GB/T 7714) | Gao, Junyu, Chen, Mengyuan, Xu, Changsheng. Vectorized Evidential Learning for Weakly-Supervised Temporal Action Localization[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45(12): 15949-15963. |
APA | Gao, Junyu, Chen, Mengyuan, & Xu, Changsheng. (2023). Vectorized Evidential Learning for Weakly-Supervised Temporal Action Localization. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 45(12), 15949-15963. |
MLA | Gao, Junyu, et al. "Vectorized Evidential Learning for Weakly-Supervised Temporal Action Localization." IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 45.12 (2023): 15949-15963. |
Ingestion method: OAI harvesting
Source: Institute of Automation
Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.