Multi-Sensor based Human Activity Recognition
Format: Data Set
Language: English
Published: Zenodo, 08.01.2025
ISSN: 2504-4990
DOI: 10.5281/zenodo.14613210
Dataset Description
Multi-sensor data collection was carried out in June 2021 at Coventry University. In this data collection, three types of sensors (radar, infrared and acoustic) were fused together by MATLAB code. One radar sensor and one acoustic sensor were used, and three IR (Grid-Eye) sensors were integrated together, to eliminate the limitations of a single sensor and to get the maximum benefit of multi-sensor human activity detection.
Overall, 11 subjects took part in the data-collection process, mainly postgraduate researchers and academics. The collected dataset is novel in that each recording consists of a series of human activities performed in sequence, rather than a single activity at a time.
The experiment was designed carefully with elderly people in mind. Each series comprises day-to-day activities such as walking, sitting and talking, as well as situations that need attention, such as falling and asking for help.
In this data collection, seven sets of activities were recorded: series one to three were single-subject activities and series four to seven were dual-subject activities. Each series was performed ten times by each subject.
A description of each series is given below.
Series | Number of Subjects | Activity
---|---|---
Series-1 | Single | Sit (talking) + sit to standing (help) + walking left to right (coughing) + falling (screaming)
Series-2 | Single | Bending (pick up food) + walk right to left (coughing) + stand to sit (talking) + sit while eating
Series-3 | Single | Walking from corner left diagonally to corner right (drop a metal spoon) + return diagonally (help) + bending to pick up the spoon (talk) + standing from bending (scream) + walking back to the original corner (drop the spoon)
The Data_Collection folder contains seven sub-folders, Series-1 to Series-7. Each of these contains the recordings of that series, performed 10 times by each subject, and each performed series has a folder of sensor data in it; a sketch of how this hierarchy might be traversed is given below. The content of the sensor-data folder is described in the table that follows.
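For orientation, here is a minimal Python sketch of how the folder hierarchy might be traversed; the root path and the assumption that every repetition sits in its own sub-folder of a Series-N folder are illustrative only.

```python
from pathlib import Path

# Root of the dataset as unpacked locally; the path is an assumption.
root = Path("Data_Collection")

# Walk Series-1 ... Series-7 and count the recording folders inside each.
# The exact naming of the per-recording folders is not specified here,
# so they are globbed rather than assumed.
for series_dir in sorted(root.glob("Series-*")):
    recordings = [p for p in series_dir.iterdir() if p.is_dir()]
    print(f"{series_dir.name}: {len(recordings)} recorded repetitions")
```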
Sr No. | File Name | Type | Description
---|---|---|---
1 | AudioFiles (folder) | WAV files | Collected by the UMA-16 acoustic sensor, which captures sound while the activity series is performed
2 | AcousticData | MATLAB file | 16-channel numerical data collected from the acoustic sensor
3 | GridEyeData | MATLAB file | Three data files (data1, data2 and data3) from the three Grid-Eye sensors, with time stamps for each captured frame in t1, t2 and t3; refer to GridEye_Read for the readable form of data1, data2 and data3
4 | RadarData | MATLAB file | Range and noise data (rpVar, npVar), RangeDopplerMatrix (no. of frames x 16 x 256) and rangeDopplerVarArray (frame no. x 256)
5 | Miscellaneous Data | MATLAB file | Important information such as the driver name (IR sensor), frame length, sampling frequency, folder path, current time, total experiment time and so on
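The sketch below shows how one such recording might be loaded in Python with SciPy. It assumes the MATLAB files carry a .mat extension and were saved in a format scipy.io.loadmat can read; the recording folder name ("subject01_run01") is hypothetical, while the in-file variable names follow the table above.

```python
from pathlib import Path

from scipy.io import loadmat, wavfile

# Path to one performed series; "subject01_run01" is a hypothetical
# folder name used purely for illustration.
recording = Path("Data_Collection/Series-1/subject01_run01")

# Radar data: rpVar, npVar, RangeDopplerMatrix (frames x 16 x 256) and
# rangeDopplerVarArray (frames x 256), as listed in the table above.
radar = loadmat(str(recording / "RadarData.mat"))
print(radar["RangeDopplerMatrix"].shape, radar["rangeDopplerVarArray"].shape)

# Grid-Eye data: data1/data2/data3 plus per-frame time stamps t1/t2/t3.
grideye = loadmat(str(recording / "GridEyeData.mat"))
ir_frames = grideye["data1"]  # frames from the first IR sensor

# Acoustic data: 16-channel numerical values; the in-file variable name
# is not documented here, so the non-metadata keys are listed instead.
acoustic = loadmat(str(recording / "AcousticData.mat"))
print([key for key in acoustic if not key.startswith("__")])

# Raw audio: any WAV file captured by the UMA-16 array.
wav_path = next((recording / "AudioFiles").glob("*.wav"))
fs, audio = wavfile.read(str(wav_path))
print(f"audio: {audio.shape} samples at {fs} Hz")
```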
The data from each sensor can be read and visualised in MATLAB, and can also be converted to Python data files. This pre-processing is done before data analysis.
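As an illustration of the conversion to a Python data file mentioned above, the following sketch repackages the radar and Grid-Eye variables of one recording into a single compressed NumPy archive; the file names and loadmat assumptions are the same hypothetical ones as in the previous sketch.

```python
import numpy as np
from scipy.io import loadmat

# Hypothetical per-recording files, as in the previous sketch.
radar = loadmat("RadarData.mat")
grideye = loadmat("GridEyeData.mat")

# Bundle the arrays of interest into one compressed Python data file.
np.savez_compressed(
    "recording.npz",
    range_doppler=radar["RangeDopplerMatrix"],
    range_doppler_var=radar["rangeDopplerVarArray"],
    grideye1=grideye["data1"],
    grideye2=grideye["data2"],
    grideye3=grideye["data3"],
    t1=grideye["t1"].squeeze(),
    t2=grideye["t2"].squeeze(),
    t3=grideye["t3"].squeeze(),
)

# Later analysis can reload everything without MATLAB.
bundle = np.load("recording.npz")
print(bundle["range_doppler"].shape)
```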