BgNet: Classification of benign and malignant tumors with MRI multi-plane attention learning
Document type: Journal article
Authors | Liu, Hong (6); Jiao, Meng-Lei (5,6); Xing, Xiao-Ying (4); Ou-Yang, Han-Qiang (1,2,3); Yuan, Yuan (4); Liu, Jian-Fang (4); Li, Yuan (4); Wang, Chun-Jie (4); Lang, Ning (4); Qian, Yue-Liang (6) |
Journal | FRONTIERS IN ONCOLOGY |
Publication date | 2022-10-27 |
Volume | 12 |
Pages | 12 |
Keywords | tumor classification; deep learning; multi-plane fusion; benign; malignant |
ISSN | 2234-943X |
DOI | 10.3389/fonc.2022.971871 |
Abstract | Objectives: To propose a deep learning-based classification framework that performs patient-level benign and malignant tumor classification from a patient's multi-plane images and clinical information. Methods: A total of 430 spinal tumor cases with axial and sagittal MRI plane images were included, of which 297 cases (14,072 images) were used for training and 133 cases (6,161 images) for testing. Based on a bipartite graph and attention learning, this study proposes a multi-plane attention learning framework, BgNet, for benign and malignant tumor diagnosis. In the bipartite graph structure, the tumor area in each plane serves as a vertex of the graph, and the matching between different planes serves as an edge. The tumor areas from the different plane images are spliced at the input layer. Building on the convolutional neural network ResNet and the visual attention learning model Swin-Transformer, the study also proposes a feature fusion model named ResNetST that combines global and local information to extract correlation features across multiple planes. BgNet consists of five modules: a bipartite-graph-based multi-plane fusion module, an input-layer fusion module, a feature-layer fusion module, a decision-layer fusion module, and an output module. These modules perform multi-level fusion of a patient's multi-plane image data to realize a comprehensive patient-level diagnosis of benign and malignant tumors. Results: The accuracy (ACC: 79.7%) of BgNet with multiple planes was higher than with a single plane, and higher than or equal to the four doctors' ACC (D1: 70.7%, p=0.219; D2: 54.1%, p<0.005; D3: 79.7%, p=0.006; D4: 72.9%, p=0.178). Moreover, with the aid of BgNet the doctors' diagnostic accuracy and speed further improved: the ACC of D1, D2, D3, and D4 improved by 4.5%, 21.8%, 0.8%, and 3.8%, respectively. Conclusions: The proposed deep learning framework BgNet can classify benign and malignant tumors effectively and can help doctors improve their diagnostic efficiency and accuracy. The code is available at https://github.com/research-med/BgNet. |
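The abstract describes the ResNetST fusion idea only at a high level. The sketch below is not the authors' released code (which is at https://github.com/research-med/BgNet) but one plausible reading of it: a ResNet branch for local features and a Swin-Transformer branch for global attention features, fused at the feature layer before a benign/malignant head. The backbone sizes, feature dimensions, and the fusion MLP are assumptions made for illustration only.

```python
# A minimal sketch (assumed, not the authors' implementation) of a ResNetST-style
# fusion: local CNN features from ResNet + global attention features from a
# Swin-Transformer, concatenated at the feature layer for a benign/malignant head.
import torch
import torch.nn as nn
import torchvision.models as tvm
import timm


class ResNetSTSketch(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Local-feature branch: ResNet-18 with its classification head removed.
        resnet = tvm.resnet18(weights=None)
        self.cnn = nn.Sequential(*list(resnet.children())[:-1])   # -> (B, 512, 1, 1)
        # Global-feature branch: Swin-Transformer with the head removed
        # (num_classes=0 makes timm return pooled features).
        self.swin = timm.create_model(
            "swin_tiny_patch4_window7_224", pretrained=False, num_classes=0
        )                                                          # -> (B, 768)
        # Feature-layer fusion of the two branches (assumed: concat + small MLP).
        self.fuse = nn.Sequential(
            nn.Linear(512 + 768, 256), nn.ReLU(inplace=True), nn.Dropout(0.1)
        )
        self.head = nn.Linear(256, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: a batch of spliced axial/sagittal tumor crops, shape (B, 3, 224, 224).
        local_feat = self.cnn(x).flatten(1)      # (B, 512)
        global_feat = self.swin(x)               # (B, 768)
        fused = self.fuse(torch.cat([local_feat, global_feat], dim=1))
        return self.head(fused)                  # per-image benign/malignant logits


if __name__ == "__main__":
    model = ResNetSTSketch()
    logits = model(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 2])
```

In the paper, per-image outputs like these would still pass through the decision-layer fusion module to produce a single patient-level prediction; that aggregation step is omitted here.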
Funding | Beijing Natural Science Foundation; National Natural Science Foundation of China; Capital's Funds for Health Improvement and Research; [Z190020]; [62276250]; [82171927]; [81971578]; [2020-4-40916] |
WOS research area | Oncology |
Language | English |
WOS accession number | WOS:000885875900001 |
Publisher | FRONTIERS MEDIA SA |
Source URL | http://119.78.100.204/handle/2XEOYT63/19888 |
Collection | Journal articles, Institute of Computing Technology, Chinese Academy of Sciences |
Corresponding authors | Liu, Hong; Jiang, Liang; Yuan, Hui-Shu |
Author affiliations | 1. Beijing Key Lab Spinal Dis Res, Beijing, Peoples R China; 2. Engn Res Ctr Bone & Joint Precis Med, Beijing, Peoples R China; 3. Peking Univ Third Hosp, Dept Orthopaed, Beijing, Peoples R China; 4. Peking Univ Third Hosp, Dept Radiol, Beijing, Peoples R China; 5. Univ Chinese Acad Sci, Beijing, Peoples R China; 6. Chinese Acad Sci, Inst Comp Technol, Beijing Key Lab Mobile Comp & Pervas Device, Beijing, Peoples R China |
Recommended citation (GB/T 7714) | Liu, Hong, Jiao, Meng-Lei, Xing, Xiao-Ying, et al. BgNet: Classification of benign and malignant tumors with MRI multi-plane attention learning[J]. FRONTIERS IN ONCOLOGY, 2022, 12: 12. |
APA | Liu, Hong., Jiao, Meng-Lei., Xing, Xiao-Ying., Ou-Yang, Han-Qiang., Yuan, Yuan., ... & Wang, Xiang-Dong. (2022). BgNet: Classification of benign and malignant tumors with MRI multi-plane attention learning. FRONTIERS IN ONCOLOGY, 12, 12. |
MLA | Liu, Hong, et al. "BgNet: Classification of benign and malignant tumors with MRI multi-plane attention learning". FRONTIERS IN ONCOLOGY 12 (2022): 12. |
Ingest method: OAI harvesting
Source: Institute of Computing Technology