Real-Time Multi-Person Identification and Tracking via HPE and IMU Data Fusion

Bibliographic Details
Published in: Proceedings - Design, Automation, and Test in Europe Conference and Exhibition (DATE), pp. 1-6
Main Authors: De Marchi, Mirco; Turetta, Cristian; Pravadelli, Graziano; Bombieri, Nicola
Format: Conference Proceeding
Language: English
Published: EDAA, 25.03.2024
ISSN: 1558-1101
DOI: 10.23919/DATE58400.2024.10546744

Summary: In the context of smart environments, crafting remote monitoring systems that are efficient, cost-effective, user-friendly, and respectful of privacy is crucial for many scenarios. Recognizing and tracing individuals via markerless motion capture systems in multi-person settings poses challenges due to obstructions, varying light conditions, and intricate interactions among subjects. In contrast, methods based on data gathered by Inertial Measurement Units (IMUs) located in wearables grapple with other issues, including the precision of the sensors and their optimal placement on the body. We claim that more accurate results can be achieved by combining Human Pose Estimation (HPE) techniques with information collected by wearables. To this end, we introduce a real-time platform that fuses HPE and IMU data to track and identify people. It exploits a matching model that consists of two synergistic components: the first employs a geometric approach, correlating orientation, acceleration, and velocity readings from the input sources. The second utilizes a Convolutional Neural Network (CNN) to yield a correlation coefficient for each HPE and IMU data pair. The proposed platform achieves promising results in identification and tracking, with an accuracy rate of 96.9%.
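The matching model described in the abstract lends itself to a compact illustration. The Python sketch below is not the authors' implementation: it assumes the HPE stage yields a 3D joint trajectory per tracked person and each wearable IMU reports an acceleration-magnitude window for a known identity, derives acceleration from the HPE trajectory by finite differences, scores each HPE-IMU pair with a normalized correlation (standing in for the geometric component), and resolves the pairing with the Hungarian algorithm. All function names, the sampling rate, and the assignment step are illustrative assumptions.

```python
# Minimal sketch of HPE/IMU identity matching (illustrative, not the paper's code).
import numpy as np
from scipy.optimize import linear_sum_assignment

def hpe_accel_magnitude(positions, dt):
    """Acceleration magnitude from an HPE joint trajectory (N x 3),
    estimated with second-order finite differences."""
    accel = np.gradient(np.gradient(positions, dt, axis=0), dt, axis=0)
    return np.linalg.norm(accel, axis=1)

def correlation(a, b):
    """Normalized correlation of two equal-length 1-D signals."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float(np.mean(a * b))

def match_ids(hpe_tracks, imu_windows, dt=1.0 / 30.0):
    """Assign each IMU (worn by a known person) to one HPE track.

    hpe_tracks:  list of (N x 3) joint-position windows, one per tracked person
    imu_windows: list of (N,) acceleration-magnitude windows, one per IMU
    Returns (imu_index, track_index) pairs maximizing total correlation.
    """
    scores = np.array([[correlation(hpe_accel_magnitude(track, dt), imu)
                        for track in hpe_tracks] for imu in imu_windows])
    rows, cols = linear_sum_assignment(-scores)  # Hungarian algorithm, maximized
    return list(zip(rows.tolist(), cols.tolist()))

# Toy example: two tracks; each IMU is a noisy copy of one track's acceleration.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 60)
dt = t[1] - t[0]
track_a = np.stack([np.sin(3 * t), t, np.zeros_like(t)], axis=1)
track_b = np.stack([np.sin(7 * t), np.zeros_like(t), t ** 2], axis=1)
imu_b = hpe_accel_magnitude(track_b, dt) + 0.05 * rng.standard_normal(t.size)
imu_a = hpe_accel_magnitude(track_a, dt) + 0.05 * rng.standard_normal(t.size)
print(match_ids([track_a, track_b], [imu_b, imu_a], dt))  # expected: [(0, 1), (1, 0)]
```

In the paper's full pipeline the geometric component also correlates orientation and velocity, and the CNN component contributes a learned correlation coefficient per HPE-IMU pair; one natural (assumed) way to combine them would be to fuse the per-pair scores into the same matrix before the assignment step.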