M3SC: A generic dataset for mixed multi-modal (MMM) sensing and communication integration

Bibliographic Details
Published in: China Communications, Vol. 20, No. 11, pp. 13-29
Main Authors: Cheng, Xiang; Huang, Ziwei; Bai, Lu; Zhang, Haotian; Sun, Mingran; Liu, Boxun; Li, Sijiang; Zhang, Jianan; Lee, Minson
Format: Journal Article
Language: English
Published: China Institute of Communications, 01.11.2023
Author Affiliations: State Key Laboratory of Advanced Optical Communication Systems and Networks, School of Electronics, Peking University, Beijing 100871, China; Joint SDU-NTU Centre for Artificial Intelligence Research (C-FAIR), Shandong University, Jinan 250100, China; Ever-Florescence Technology, Nanjing 210000, China
ISSN: 1673-5447
DOI: 10.23919/JCC.fa.2023-0268.202311

Summary: The sixth generation (6G) of mobile communication systems is witnessing a new paradigm shift, i.e., the integrated sensing-communication system. A comprehensive dataset is a prerequisite for 6G integrated sensing-communication research. This paper develops a novel simulation dataset, named M3SC, for mixed multi-modal (MMM) sensing-communication integration, and the generation framework of the M3SC dataset is further given. To obtain multi-modal sensory data in physical space and communication data in electromagnetic space, we utilize AirSim and WaveFarer to collect multi-modal sensory data and exploit Wireless InSite to collect communication data. Furthermore, the in-depth integration and precise alignment of AirSim, WaveFarer, and Wireless InSite are achieved. The M3SC dataset covers various weather conditions, multiple frequency bands, and different times of the day. Currently, the M3SC dataset contains 1,500 snapshots, each including 80 RGB images, 160 depth maps, 80 LiDAR point clouds, 256 sets of mmWave waveforms with 8 radar point clouds, and 72 channel impulse response (CIR) matrices, thus totaling 120,000 RGB images, 240,000 depth maps, 120,000 LiDAR point clouds, 384,000 sets of mmWave waveforms with 12,000 radar point clouds, and 108,000 CIR matrices. The data processing results present the multi-modal sensory information and the statistical properties of the communication channel. Finally, the MMM sensing-communication applications that can be supported by the M3SC dataset are discussed.
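As a rough illustration of how the per-snapshot counts quoted above scale to the reported dataset totals, the following Python sketch multiplies each modality count by the 1,500 snapshots. The SnapshotSpec structure and field names are our own shorthand for the abstract's figures, not part of any official M3SC release or API.

```python
# Illustrative sketch only: per-snapshot composition as reported in the abstract.
from dataclasses import dataclass

@dataclass
class SnapshotSpec:
    rgb_images: int = 80            # RGB images per snapshot
    depth_maps: int = 160           # depth maps per snapshot
    lidar_point_clouds: int = 80    # LiDAR point clouds per snapshot
    mmwave_waveform_sets: int = 256 # sets of mmWave waveforms per snapshot
    radar_point_clouds: int = 8     # radar point clouds per snapshot
    cir_matrices: int = 72          # channel impulse response matrices per snapshot

NUM_SNAPSHOTS = 1500  # total number of snapshots reported in the abstract

def dataset_totals(spec: SnapshotSpec, snapshots: int = NUM_SNAPSHOTS) -> dict:
    """Scale the per-snapshot counts to dataset-wide totals."""
    return {field: count * snapshots for field, count in vars(spec).items()}

if __name__ == "__main__":
    # Reproduces the totals in the abstract: 120,000 RGB images, 240,000 depth
    # maps, 120,000 LiDAR point clouds, 384,000 mmWave waveform sets,
    # 12,000 radar point clouds, and 108,000 CIR matrices.
    for name, total in dataset_totals(SnapshotSpec()).items():
        print(f"{name}: {total:,}")
```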