Multimodal integration of micro-Doppler sonar and auditory signals for behavior classification with convolutional networks

Bibliographic Details
Published in: International Journal of Neural Systems, Vol. 23, No. 5, p. 1350021
Main Authors: Dura-Bernal, Salvador; Garreau, Guillaume; Georgiou, Julius; Andreou, Andreas G.; Denham, Susan L.; Wennekers, Thomas
Format: Journal Article
Language: English
Published: Singapore, 01.10.2013

Summary: The ability to recognize the behavior of individuals is of great interest in the general field of safety (e.g. building security, crowd control, transport analysis, independent living for the elderly). Here we report a new real-time acoustic system for human action and behavior recognition that integrates passive audio and active micro-Doppler sonar signatures over multiple time scales. The system architecture is based on a six-layer convolutional neural network, trained and evaluated using a dataset of 10 subjects performing seven different behaviors. Probabilistic combination of system output through time for each modality separately yields 94% (passive audio) and 91% (micro-Doppler sonar) correct behavior classification; probabilistic multimodal integration increases classification performance to 98%. This study supports the efficacy of micro-Doppler sonar systems in characterizing human actions, which can then be efficiently classified using ConvNets. It also demonstrates that the integration of multiple sources of acoustic information can significantly improve the system's performance.
ISSN: 0129-0657
DOI: 10.1142/S0129065713500214
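
The summary describes combining per-modality ConvNet outputs probabilistically over time and then across the passive-audio and micro-Doppler channels. The record does not give the exact fusion rule, so the sketch below is only a minimal illustration, assuming each modality's network emits per-frame class posteriors that are accumulated in log space over time and then multiplied across modalities under a conditional-independence assumption. All names (combine_over_time, fuse_modalities, NUM_BEHAVIORS) and the random stand-in data are hypothetical, not taken from the paper.

# Minimal sketch of probabilistic late fusion, assuming per-frame class
# posteriors from each modality's ConvNet. Names and shapes are assumptions,
# not the authors' implementation.
import numpy as np

NUM_BEHAVIORS = 7  # seven behavior classes, as stated in the abstract

def combine_over_time(frame_posteriors: np.ndarray) -> np.ndarray:
    """Combine per-frame posteriors (frames x classes) into one posterior
    over behaviors by summing log-probabilities across frames."""
    log_post = np.log(frame_posteriors + 1e-12).sum(axis=0)
    log_post -= log_post.max()          # numerical stability
    post = np.exp(log_post)
    return post / post.sum()            # renormalize to a distribution

def fuse_modalities(audio_post: np.ndarray, sonar_post: np.ndarray) -> np.ndarray:
    """Fuse the two per-modality posteriors by element-wise product,
    assuming conditional independence given the behavior class."""
    fused = audio_post * sonar_post
    return fused / fused.sum()

# Random stand-in outputs for the two ConvNet pipelines (50 frames each).
rng = np.random.default_rng(0)
audio_frames = rng.dirichlet(np.ones(NUM_BEHAVIORS), size=50)
sonar_frames = rng.dirichlet(np.ones(NUM_BEHAVIORS), size=50)

audio_post = combine_over_time(audio_frames)
sonar_post = combine_over_time(sonar_frames)
fused_post = fuse_modalities(audio_post, sonar_post)
print("Predicted behavior class:", int(np.argmax(fused_post)))

Summing log-probabilities over frames is equivalent to multiplying independent per-frame posteriors; whether the published system uses this exact rule or a weighted variant is not specified in this record.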