Handling Data Scarcity Through Data Augmentation in Training of Deep Neural Networks for 3D Data Processing


Bibliographic Details
Published in: International Journal on Semantic Web and Information Systems, Vol. 18, No. 1, pp. 1-16
Main Authors: Srivastava, Akhilesh Mohan; Rotte, Priyanka Ajay; Jain, Arushi; Prakash, Surya
Format: Journal Article
Language: English
Published: Hershey: IGI Global, 01.01.2022

Summary: Due to the availability of cheap 3D sensors such as Kinect and LiDAR, the use of 3D data in domains such as manufacturing, healthcare, and retail to achieve operational safety, improved outcomes, and enhanced customer experience has gained momentum in recent years. In many of these domains, object recognition is performed on 3D data to avoid the difficulties posed by illumination, pose variation, scaling, etc., in 2D data. In this work, we propose three data augmentation techniques for 3D data in point cloud representation that use sub-sampling. We then verify that the 3D samples created through data augmentation carry the same information by comparing the Iterative Closest Point registration error within the sub-samples, between the sub-samples and their parent sample, between sub-samples with different parents but the same subject, and finally, between the sub-samples of different subjects. We also verify that the augmented sub-samples have the same characteristics and features as the original 3D point cloud by applying the Central Limit Theorem.
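The sub-sampling idea summarized above can be sketched in a few lines: each augmented sample is formed by randomly retaining a subset of the parent point cloud's points. This is a minimal illustration only; the function name, keep ratio, and number of sub-samples are assumptions for demonstration, not the paper's actual parameters or method.

```python
import numpy as np

def subsample_point_cloud(points, n_subsamples=3, keep_ratio=0.8, seed=0):
    """Create augmented sub-samples of a 3D point cloud (N x 3 array)
    by randomly keeping a fraction of its points without replacement.
    Hypothetical sketch of sub-sampling-based augmentation."""
    rng = np.random.default_rng(seed)
    n_keep = int(len(points) * keep_ratio)
    subsamples = []
    for _ in range(n_subsamples):
        idx = rng.choice(len(points), size=n_keep, replace=False)
        subsamples.append(points[idx])
    return subsamples

# Example: augment a synthetic cloud of 1000 points into 3 sub-samples
cloud = np.random.default_rng(42).normal(size=(1000, 3))
augmented = subsample_point_cloud(cloud)
print(len(augmented), augmented[0].shape)  # 3 sub-samples, each 800 x 3
```

Because each sub-sample is drawn from the same parent cloud, its points preserve the parent's geometry, which is what the authors verify via ICP registration error and the Central Limit Theorem.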
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ISSN: 1552-6283, 1552-6291
DOI: 10.4018/IJSWIS.297038