Coronary vessel segmentation using multiresolution and multiscale deep learning

Zhengqiang Jiang*, Chubin Ou, Yi Qian, Rajan Rehan, Andy Yong

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We present a coronary vessel segmentation method for X-ray coronary angiography images using multiresolution and multiscale deep learning. Our method constructs a set of multiresolution images from an input image via bilinear interpolation, which handles coronary vessels with an uneven distribution of contrast. We incorporate multiresolution and multiscale convolution filtering into a U-Net network, which improves segmentation accuracy by coping with the varying thickness of coronary vessels at different positions. We investigate two types of experiments: a multiresolution strategy with U-Net and a multiscale strategy with U-Net. Our method has been evaluated and compared both qualitatively against networks such as a single U-Net, Attention U-Net, R2U-Net and R2AttU-Net, and quantitatively against 20 state-of-the-art visual segmentation methods on a benchmark X-ray coronary angiography database. The experiments demonstrate that our method significantly outperforms both each of these networks alone and the 20 methods in terms of the Dice coefficient, which is widely regarded as a primary evaluation criterion for segmentation results.
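Two ingredients named in the abstract, the multiresolution image set built by bilinear interpolation and the Dice coefficient metric, can be sketched as follows. This is a minimal NumPy illustration under our own assumptions (function names, a halving pyramid of configurable depth), not the authors' implementation:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D grayscale image with bilinear interpolation."""
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)   # sample coordinates in the source image
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]  # fractional weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def multiresolution_set(img, levels=3):
    """Build a set of progressively halved resolutions of the input image.
    The halving factor and number of levels are illustrative assumptions."""
    h, w = img.shape
    return [bilinear_resize(img, h >> k, w >> k) for k in range(levels)]

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2|P ∩ T| / (|P| + |T|) for binary segmentation masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)
```

The resulting image set would feed the network's multiresolution branches, and the Dice coefficient scores a predicted mask against the ground truth, with `eps` guarding against division by zero on empty masks.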

Original language: English
Article number: 100602
Pages (from-to): 1-9
Number of pages: 9
Journal: Informatics in Medicine Unlocked
Volume: 24
DOIs
Publication status: Published - 24 May 2021

Bibliographical note

Copyright the Author(s) 2021. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.
