Chinese Academy of Sciences Institutional Repositories Grid
JOINT MULTI-TASK LEARNING FOR SURVIVAL PREDICTION OF GASTRIC CANCER PATIENTS USING CT IMAGES

Document Type: Conference Paper

Authors: Liwen Zhang 1,5; Di Dong 1,5; Zaiyi Liu 4; Junlin Zhou 3; Jie Tian 1,2
Publication Date: 2021
Conference Date: 2021-04
Conference Location: Online (virtual conference)
Abstract

Accurate pre-operative overall survival (OS) prediction for gastric cancer patients is of great significance for personalized treatment, but the accuracy of OS prediction has been limited by existing methods. To improve survival prediction, we propose a novel joint multi-task network that is equipped with multi-level features and simultaneously predicts clinical tumor (T) and node (N) stages. Two independent datasets, a training set (377 patients) and a test set (122 patients), are used to evaluate the proposed network. The results indicate that the multi-task network benefits from capturing multi-level features and from sharing prognostic information across the correlated clinical-stage prediction tasks, which enables it to predict OS accurately. Our method outperforms existing methods with the highest c-index (training: 0.73; test: 0.72). It also shows better prognostic value, with the highest hazard ratio (training: 3.77; test: 4.28) for dividing patients into high- and low-risk groups.
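The record gives no implementation details, so the following PyTorch sketch only illustrates the general idea stated in the abstract: a shared CT-image encoder feeding three heads, an OS risk score trained with a Cox partial-likelihood loss plus two auxiliary classification heads for clinical T and N stages. The encoder layout, stage class counts, and loss weights are assumptions for illustration, not the authors' published architecture, and the multi-level feature fusion is omitted for brevity.

```python
import torch
import torch.nn as nn

class JointMultiTaskNet(nn.Module):
    """Illustrative multi-task model: a shared CT encoder with three heads.
    Layer sizes and stage class counts are assumptions, not the paper's design."""
    def __init__(self, n_t_stages=4, n_n_stages=4):
        super().__init__()
        self.encoder = nn.Sequential(            # shared feature extractor
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.risk_head = nn.Linear(64, 1)        # OS risk score (Cox model)
        self.t_head = nn.Linear(64, n_t_stages)  # clinical T-stage logits
        self.n_head = nn.Linear(64, n_n_stages)  # clinical N-stage logits

    def forward(self, x):
        z = self.encoder(x)
        return self.risk_head(z).squeeze(-1), self.t_head(z), self.n_head(z)

def cox_partial_likelihood(risk, time, event):
    """Negative Cox partial log-likelihood; `event` is a float 0/1 tensor."""
    order = torch.argsort(time, descending=True)  # risk set = patients with time >= t_i
    risk, event = risk[order], event[order]
    log_cum = torch.logcumsumexp(risk, dim=0)     # log of risk-set sums
    return -((risk - log_cum) * event).sum() / event.sum().clamp(min=1)

def joint_loss(model, x, time, event, t_stage, n_stage):
    # Joint objective: survival loss plus auxiliary stage-classification losses.
    # The 0.5 weights are placeholders, not the published values.
    risk, t_logits, n_logits = model(x)
    ce = nn.functional.cross_entropy
    return (cox_partial_likelihood(risk, time, event)
            + 0.5 * ce(t_logits, t_stage)
            + 0.5 * ce(n_logits, n_stage))
```

Under this sketch, joint_loss would be minimized over mini-batches; at inference only the risk head is needed to rank patients, which is what the reported c-index measures, and thresholding that score would yield the high- and low-risk split behind the hazard ratios.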

Proceedings Publisher: IEEE
Source URL: http://ir.ia.ac.cn/handle/173211/57482
Collection: Institute of Automation, CAS Key Laboratory of Molecular Imaging
Corresponding Author: Jie Tian
Author Affiliations:
1. CAS Key Laboratory of Molecular Imaging, Institute of Automation, Chinese Academy of Sciences
2. Beijing Advanced Innovation Center for Big Data-Based Precision Medicine, School of Medicine, Beihang University
3. Department of Radiology, Lanzhou University Second Hospital
4. Department of Radiology, Guangdong General Hospital
5. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Liwen Zhang, Di Dong, Zaiyi Liu, et al. JOINT MULTI-TASK LEARNING FOR SURVIVAL PREDICTION OF GASTRIC CANCER PATIENTS USING CT IMAGES[C]. Online conference, 2021-04.

Deposit Method: OAI harvesting

Source: Institute of Automation

