Consistent response for automated multilabel thoracic disease classification

Bibliographic Details
Published in: Concurrency and Computation: Practice and Experience, Vol. 34, No. 23
Main Authors: Su, Jiawei; Luo, Zhiming; Li, Shaozi
Format: Journal Article
Language: English
Published: Hoboken: Wiley Subscription Services, Inc., 25.10.2022
Summary: While recent studies on automated multilabel chest X‐ray (CXR) image classification have shown remarkable progress in leveraging complicated networks and attention mechanisms, automated detection on chest radiographs remains challenging because pathological patterns are usually highly diverse in size and location. CNN models suffer from complicated backgrounds and the high diversity of diseases, which reduces their generalization and performance. To solve these problems, we propose a dual‐distribution consistency (DDC) model, which enforces consistency at two levels: the feature level and the label level. The model integrates two novel loss functions: a multilabel response consistency (MRC) loss and a distribution consistency (DC) loss. Specifically, we use the original image and its transformed image as inputs to imitate different views of CXR images. The MRC loss encourages the multilabel‐wise attention maps to be consistent between the original CXR image and its transformed counterpart, and the DC loss forces their output probability distributions to agree. In this manner, the model learns discriminative features from different views of CXR images. Experiments conducted on the ChestX‐ray14 dataset show the effectiveness of the proposed method.
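The abstract describes the two consistency terms only at a high level. As a minimal illustrative sketch, assuming a backbone that returns per-class attention maps of shape (B, C, H, W) and multilabel logits of shape (B, C) for both views, the losses could look like the following PyTorch code; all names and the choice of MSE are assumptions, not the authors' implementation:

import torch
import torch.nn.functional as F

def consistency_losses(attn_orig, attn_aug, logits_orig, logits_aug):
    # attn_*: (B, C, H, W) per-class attention maps for the original view
    # and the transformed view of the same CXR image; assumes any geometric
    # transform has been undone so the two maps are spatially aligned.
    # logits_*: (B, C) multilabel classification logits for each view.

    # MRC-style term: encourage the per-class attention maps of the two
    # views to agree (MSE over L2-normalized, flattened maps).
    a1 = F.normalize(attn_orig.flatten(2), dim=-1)  # (B, C, H*W)
    a2 = F.normalize(attn_aug.flatten(2), dim=-1)
    mrc = F.mse_loss(a1, a2)

    # DC-style term: align the per-class output probabilities. In the
    # multilabel setting each class is an independent Bernoulli, so MSE
    # between sigmoid probabilities is one simple choice.
    p1 = torch.sigmoid(logits_orig)
    p2 = torch.sigmoid(logits_aug)
    dc = F.mse_loss(p1, p2)
    return mrc, dc

In training, such terms would typically be added to the supervised binary cross-entropy loss with weighting coefficients, for example total = bce + lambda_mrc * mrc + lambda_dc * dc.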
Bibliography: Funding information
China Postdoctoral Science Foundation Grant; Guiding Project of Science and Technology Department of Fujian Province, Grant/Award Number: 2019Y0018; National Natural Science Foundation of China, Grant/Award Numbers: 61806172, 61876159, 62076116
ISSN: 1532-0626, 1532-0634
DOI: 10.1002/cpe.7201