Chinese Academy of Sciences Institutional Repositories Grid
Weakly-Supervised Cross-Domain Adaptation for Endoscopic Lesions Segmentation

Document Type: Journal Article

Authors: Dong JH (董家华)1,2,3; Cong Y (丛杨)2,3; Sun G (孙干)2,3; Yang YS (杨云生)4; Xu XW (徐晓伟)5; Ding ZM (丁正明)6
Journal: IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY
Publication Date: 2021
Volume: 31; Issue: 5; Pages: 2020-2033
Keywords: Weakly-supervised learning; endoscopic lesions segmentation; semantic knowledge transfer; domain adaptation
ISSN: 1051-8215
Ownership Ranking: 1
English Abstract

Weakly-supervised learning has attracted growing research attention in medical lesion segmentation because it significantly reduces pixel-level annotation cost. However, 1) most existing methods require effective priors and constraints to explore the intrinsic lesion characterization, and otherwise generate only incorrect and rough predictions; 2) they neglect the underlying semantic dependencies between weakly-labeled target enteroscopy diseases and fully-annotated source gastroscope lesions, while forcefully exploiting untransferable dependencies leads to negative transfer. To tackle the above issues, we propose a new weakly-supervised lesion transfer framework, which can not only explore transferable domain-invariant knowledge across different datasets, but also prevent the negative transfer of untransferable representations. Specifically, a Wasserstein-quantified transferability framework is developed to highlight wide-range transferable contextual dependencies while neglecting irrelevant semantic characterizations. Moreover, a novel self-supervised pseudo label generator is designed to equally provide confident pseudo pixel labels for both hard-to-transfer and easy-to-transfer target samples, inhibiting the enormous deviation of false pseudo pixel labels in a self-supervised manner. Afterwards, dynamically-searched feature centroids are aligned to narrow the category-wise distribution shift. Comprehensive theoretical analysis and experiments show the superiority of our model on the endoscopic dataset and several public datasets.
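The abstract names two mechanisms that can be illustrated in isolation: confidence-filtered pseudo pixel labels for target images, and category-wise feature-centroid alignment between the source and target domains. The PyTorch sketch below is a minimal, hypothetical illustration of these generic ideas only; the function names, tensor shapes, confidence threshold, and loss form are assumptions and do not reproduce the authors' Wasserstein-quantified transferability framework or their self-supervised pseudo label generator.

# Minimal PyTorch sketch (illustrative only, not the authors' implementation).
# It shows (a) confidence-thresholded pseudo pixel labels for unlabeled target
# images and (b) an L2 alignment loss between per-class feature centroids of the
# source and target domains. Shapes, threshold, and class count are assumptions.

import torch
import torch.nn.functional as F


def generate_pseudo_labels(target_logits, threshold=0.9):
    """Keep only high-confidence pixels as pseudo labels; mark the rest as -1 (ignored)."""
    probs = F.softmax(target_logits, dim=1)      # (B, C, H, W)
    confidence, pseudo = probs.max(dim=1)        # both (B, H, W)
    pseudo[confidence < threshold] = -1
    return pseudo


def class_centroids(features, labels, num_classes):
    """Mean feature vector of the pixels assigned to each class (ignored pixels skipped)."""
    B, D, H, W = features.shape
    flat_feats = features.permute(0, 2, 3, 1).reshape(-1, D)   # (B*H*W, D)
    flat_labels = labels.reshape(-1)                           # (B*H*W,)
    centroids = torch.zeros(num_classes, D, device=features.device)
    for c in range(num_classes):
        mask = flat_labels == c
        if mask.any():
            centroids[c] = flat_feats[mask].mean(dim=0)
    return centroids


def centroid_alignment_loss(src_feats, src_labels, tgt_feats, tgt_pseudo, num_classes):
    """Pull per-class centroids of the two domains together with an L2 penalty."""
    src_c = class_centroids(src_feats, src_labels, num_classes)
    tgt_c = class_centroids(tgt_feats, tgt_pseudo, num_classes)
    return F.mse_loss(src_c, tgt_c)


if __name__ == "__main__":
    # Toy tensors standing in for network outputs (2 classes: background / lesion).
    B, C, D, H, W = 2, 2, 16, 8, 8
    tgt_logits = torch.randn(B, C, H, W)
    pseudo = generate_pseudo_labels(tgt_logits, threshold=0.6)
    src_feats, tgt_feats = torch.randn(B, D, H, W), torch.randn(B, D, H, W)
    src_labels = torch.randint(0, C, (B, H, W))
    print(centroid_alignment_loss(src_feats, src_labels, tgt_feats, pseudo, num_classes=C).item())

In the paper itself the pseudo labels come from a self-supervised generator and the alignment operates on dynamically-searched centroids, but this toy run is enough to see how low-confidence pixels (label -1) simply drop out of the centroid computation.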

WOS Keywords: BREAST-LESIONS; NETWORK; DIAGNOSIS
Funding Projects: Ministry of Science and Technology of China [2019YFB1310300]; National Natural Science Foundation of China [61722311]; National Natural Science Foundation of China [U1613214]; National Natural Science Foundation of China [61821005]; National Natural Science Foundation of China [61533015]; National Postdoctoral Innovative Talents Support Program [BX20200353]
WOS Research Area: Engineering
Language: English
WOS Record Number: WOS:000647394100027
Funding Agencies: Ministry of Science and Technology of the People's Republic of China (2019YFB1310300); National Natural Science Foundation of China (61722311, U1613214, 61821005, 61533015); National Postdoctoral Innovative Talents Support Program (BX20200353)
Source URL: [http://ir.sia.cn/handle/173321/27727]
Collection: Shenyang Institute of Automation_Robotics Laboratory
Corresponding Author: Cong Y (丛杨)
Author Affiliations:
1.University of Chinese Academy of Sciences, Beijing 100049, China
2.State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China
3.Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang 110016, China
4.Chinese PLA General Hospital, Beijing 100000, China
5.Department of Information Science, University of Arkansas at Little Rock, Arkansas 72204, USA
6.Department of Computer, Information and Technology, Indiana University-Purdue University Indianapolis, Indianapolis, IN 46202, USA
Recommended Citation
GB/T 7714
Dong JH, Cong Y, Sun G, et al. Weakly-Supervised Cross-Domain Adaptation for Endoscopic Lesions Segmentation[J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2021, 31(5): 2020-2033.
APA Dong JH, Cong Y, Sun G, Yang YS, Xu XW, & Ding ZM. (2021). Weakly-Supervised Cross-Domain Adaptation for Endoscopic Lesions Segmentation. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 31(5), 2020-2033.
MLA Dong JH, et al. "Weakly-Supervised Cross-Domain Adaptation for Endoscopic Lesions Segmentation". IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY 31.5 (2021): 2020-2033.

Ingest Method: OAI Harvesting

Source: Shenyang Institute of Automation

