CT-based data generation for foreign object detection on a single X-ray projection
Published in: Scientific Reports, Vol. 13, No. 1, p. 1881
Main Authors:
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK (Nature Portfolio), 02.02.2023
Summary: Although X-ray imaging is routinely used in industry for high-throughput product quality control, its capability to detect internal defects is strongly limited. The main challenge stems from the superposition of multiple object features within a single X-ray view. Deep convolutional neural networks can be trained on annotated datasets of X-ray images to detect foreign objects in real time. However, this approach depends heavily on the availability of large amounts of data, strongly hampering its viability for industrial use, where variability between product batches is high. We present a computationally efficient, CT-based approach for creating artificial single-view X-ray data from just a few physically CT-scanned objects. By algorithmically modifying the CT volume, a large variety of training examples is obtained. Our results show that applying the generative model to a single CT-scanned object yields image analysis accuracy that would otherwise require scans of tens of real-world samples. Our methodology leads to a strong reduction in the training data needed, improved coverage of combinations of base and foreign objects, and extensive generalizability to additional features. Once trained on just a single CT-scanned object, the resulting deep neural network can detect foreign objects in real time with high accuracy.
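The abstract describes the data-generation pipeline only at a high level: a CT-scanned attenuation volume is algorithmically modified (for instance, by inserting a virtual foreign object) and then forward-projected to produce a synthetic single-view X-ray image together with its ground-truth label. This record contains no code, so the following is a minimal NumPy sketch of that idea under simplifying assumptions (parallel-beam geometry, a spherical foreign object, a random stand-in volume); the function names and parameters are hypothetical and not the authors' implementation.

```python
import numpy as np

def insert_foreign_object(volume, radius_vox, mu_fo, rng):
    """Insert a spherical 'foreign object' with attenuation mu_fo
    at a random position inside a CT attenuation volume.
    (Hypothetical helper, not from the paper.)"""
    center = rng.integers(radius_vox, np.array(volume.shape) - radius_vox)
    zz, yy, xx = np.ogrid[:volume.shape[0], :volume.shape[1], :volume.shape[2]]
    sphere = ((zz - center[0]) ** 2 + (yy - center[1]) ** 2
              + (xx - center[2]) ** 2) <= radius_vox ** 2
    modified = volume.copy()
    modified[sphere] = mu_fo
    return modified, sphere

def project(volume, axis=0, i0=1.0):
    """Parallel-beam forward projection along one axis via the
    Beer-Lambert law: intensity = i0 * exp(-line integral of mu)."""
    return i0 * np.exp(-volume.sum(axis=axis))

rng = np.random.default_rng(0)
base = rng.uniform(0.0, 0.01, size=(64, 64, 64))  # stand-in for a real CT volume
vol_fo, sphere = insert_foreign_object(base, radius_vox=4, mu_fo=0.05, rng=rng)

xray = project(vol_fo)      # synthetic single-view X-ray image
label = sphere.any(axis=0)  # 2-D ground-truth mask for supervised training
```

Pairs of `xray` images and `label` masks generated this way are the kind of training examples a detection network would consume; a realistic pipeline would start from measured CT volumes of real products and use the scanner's actual (e.g., cone-beam) projection geometry rather than a simple axis sum.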
ISSN: 2045-2322
DOI: 10.1038/s41598-023-29079-w