A nondestructive automatic defect detection method with pixelwise segmentation
Document Type | Journal Article
Authors | Yang, Lei1,3; Fan, Junfeng2; Huo, Benyan1,3; Li, En2; Liu, Yanhong1,3
Journal | KNOWLEDGE-BASED SYSTEMS
Publication Date | 2022-04-22
Volume | 242
Pages | 12
ISSN | 0950-7051
Keywords | Defect detection; Deep architecture; Image segmentation; Attention fusion; Residual dense connection convolution network
DOI | 10.1016/j.knosys.2022.108338
Corresponding Author | Huo, Benyan (huoby@zzu.edu.cn)
Abstract | Defect detection is essential for the quality control and repair decision-making of various products. Due to collisions, uneven stress, welding parameters and other factors, cracks form on the surface or inside of products, which affect product appearance and mechanical strength and may even cause serious safety accidents. Nondestructive testing (NDT) is an effective and practical method for accurate defect detection, but it still faces various challenges from complex factors, such as complex backgrounds, poor contrast, weak texture, and class imbalance. Recently, deep learning has rapidly improved the performance of automatic defect detection through the strong feature expression ability of deep convolutional neural networks (DCNNs). However, various limitations remain due to the insufficient processing of local contextual features, which affects detection precision. To address this issue, a novel nondestructive defect detection network with an encoder-decoder structure, NDD-Net, is proposed in this paper to construct an end-to-end nondestructive defect segmentation scheme. To make the segmentation network better emphasize defect areas, an attention fusion block (AFB) is proposed to replace the raw skip connections, acquiring more discriminative features and enhancing segmentation performance on microdefects. Meanwhile, by fusing a dense connection convolution network and a residual network, a residual dense connection convolution block (RDCCB) is proposed and embedded into the segmentation network to acquire richer information about the local feature maps. Two public datasets with severe class imbalance are adopted for model evaluation: the Grima X-ray (GDXray) database and the rail surface discrete defects (RSSDs) dataset. Experimental results show that the proposed segmentation network outperforms other related segmentation models. © 2022 Elsevier B.V. All rights reserved.
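The RDCCB described in the abstract fuses dense connections (each layer receives the concatenation of all earlier feature maps) with a residual shortcut. Below is a minimal NumPy sketch of that fusion pattern only, assuming pointwise (1x1) convolutions and illustrative channel counts; it is not the authors' implementation, whose kernel sizes and layer counts are specified in the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1x1(x, w):
    # x: (C_in, H, W), w: (C_out, C_in) -> pointwise convolution over channels
    return np.einsum("oc,chw->ohw", w, x)

def relu(x):
    return np.maximum(x, 0.0)

def rdccb(x, weights):
    """Dense-plus-residual fusion sketch: every layer consumes the
    concatenation of all preceding feature maps (dense connection),
    and the block output adds the input back (residual connection)."""
    feats = [x]
    for w in weights[:-1]:
        dense_in = np.concatenate(feats, axis=0)  # dense connection
        feats.append(relu(conv1x1(dense_in, w)))
    fused = conv1x1(np.concatenate(feats, axis=0), weights[-1])
    return fused + x  # residual shortcut

C, H, W = 4, 8, 8
x = rng.standard_normal((C, H, W))
# two dense layers producing C channels each, then a fusion conv back to C
ws = [rng.standard_normal((C, C)),
      rng.standard_normal((C, 2 * C)),
      rng.standard_normal((C, 3 * C))]
y = rdccb(x, ws)
print(y.shape)  # (4, 8, 8): block preserves the input shape
```

Because the fusion conv maps the concatenated channels back to the input channel count, the residual addition is shape-compatible, which is what lets such a block be dropped into an encoder-decoder stage without changing the surrounding tensor sizes.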
WOS Keywords | SURFACE; INSPECTION; SYSTEM; MODEL
Funding Projects | National Natural Science Foundation of China [62003309]; National Key Research & Development Project of China [2020YFB1313701]; Science & Technology Research Project in Henan Province of China [202102210098]; Outstanding Foreign Scientist Support Project in Henan Province of China [GZS2019008]
WOS Research Area | Computer Science
Language | English
Publisher | ELSEVIER
WOS Accession Number | WOS:000788138900008
Funding Organizations | National Natural Science Foundation of China; National Key Research & Development Project of China; Science & Technology Research Project in Henan Province of China; Outstanding Foreign Scientist Support Project in Henan Province of China
Source URL | [http://ir.ia.ac.cn/handle/173211/48406]
Collection | Institute of Automation_State Key Laboratory of Management and Control for Complex Systems_Advanced Robotics Control Team; State Key Laboratory of Management and Control for Complex Systems_Underwater Robotics
Author Affiliations | 1. Zhengzhou Univ, Sch Elect Engn, Zhengzhou 450001, Peoples R China; 2. Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China; 3. Robot Percept & Control Engn Lab, Zhengzhou 450001, Henan, Peoples R China
Recommended Citation (GB/T 7714) | Yang, Lei, Fan, Junfeng, Huo, Benyan, et al. A nondestructive automatic defect detection method with pixelwise segmentation[J]. KNOWLEDGE-BASED SYSTEMS, 2022, 242: 12.
APA | Yang, Lei, Fan, Junfeng, Huo, Benyan, Li, En, & Liu, Yanhong. (2022). A nondestructive automatic defect detection method with pixelwise segmentation. KNOWLEDGE-BASED SYSTEMS, 242, 12.
MLA | Yang, Lei, et al. "A nondestructive automatic defect detection method with pixelwise segmentation." KNOWLEDGE-BASED SYSTEMS 242 (2022): 12.
Deposit method: OAI harvest
Source: Institute of Automation