Evaluation of Navigation Sensors in Fire Smoke Environments


Bibliographic Details
Published in: Fire Technology, Vol. 50, No. 6, pp. 1459-1481
Main Authors: Starr, Joseph W.; Lattimer, B. Y.
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.11.2014 (Springer Nature B.V.)

Summary: An experimental study was performed to quantify the performance of eleven common robotic navigation rangefinding technologies and camera systems in fire smoke environments. Instruments evaluated included two IR cameras, two visible cameras, two sonar systems, radar, a single-echo LIDAR, a multi-echo LIDAR, a Kinect™ depth sensor, and night vision. Small-scale smoke layer experiments were performed to isolate the effects of smoke visibility and gas temperature on instrument performance. Dense, low-temperature smoke tests were used to evaluate instrument performance as the smoke visibility dropped below 1 m while the smoke temperature remained below 100°C. Light, high-temperature smoke tests were used to evaluate instrument performance as the smoke reached a temperature above 250°C with the visibility above 5 m. Results from the tests show that radar systems and infrared cameras outperform the other rangefinders and cameras tested for these scenarios. A series of large-scale experiments was then performed to locate objects in a smoke-filled room and hallway. Distances from the LIDAR were subject to error when the visibility dropped below 4 m. Infrared stereo vision and radar could locate the distance to target objects immersed in the smoke to within 10% and 1%, respectively, independent of the smoke visibility level.
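The accuracy figures quoted above (within 10% for infrared stereo vision, within 1% for radar) can be read as the relative error of the estimated range to a target. Below is a minimal illustrative sketch of that metric in Python; the readings used are hypothetical and are not data from the study.

# Illustrative only: relative range error, the metric implied by the
# reported "within 10%" (IR stereo vision) and "within 1%" (radar) figures.
# The readings below are hypothetical, not measurements from the study.

def relative_range_error(measured_m: float, true_m: float) -> float:
    """Return |measured - true| / true, i.e. error as a fraction of the true range."""
    return abs(measured_m - true_m) / true_m

true_range_m = 5.0           # hypothetical ground-truth distance to the target
ir_stereo_estimate_m = 5.4   # hypothetical IR stereo reading through smoke
radar_estimate_m = 5.03      # hypothetical radar reading through smoke

print(f"IR stereo error: {relative_range_error(ir_stereo_estimate_m, true_range_m):.1%}")  # 8.0%, within the reported 10%
print(f"Radar error:     {relative_range_error(radar_estimate_m, true_range_m):.1%}")      # 0.6%, within the reported 1%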
ISSN: 0015-2684, 1572-8099
DOI: 10.1007/s10694-013-0356-3