MoCap-Based Adaptive Human-Like Walking Simulation in Laser-Scanned Large-Scale as-Built Environments

Bibliographic Details
Published in: Digital Human Modeling. Applications in Health, Safety, Ergonomics and Risk Management: Ergonomics and Health, pp. 193–204
Main Authors: Maruyama, Tsubasa; Kanai, Satoshi; Date, Hiroaki
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Lecture Notes in Computer Science

Summary: Accessibility evaluation to enhance accessibility and safety for the elderly and disabled is increasing in importance. Accessibility must be assessed not only against general standards but also in terms of physical and cognitive friendliness for users of different ages, genders, and abilities. Human behavior simulation has been progressing in crowd behavior analysis and emergency evacuation planning. This research aims to develop a virtual accessibility evaluation by combining realistic human behavior simulation using a digital human model (DHM) with as-built environmental models. To achieve this goal, we developed a new algorithm for generating human-like DHM walking motions, adapting the model's strides and turning angles to laser-scanned as-built environments using motion-capture (MoCap) data of flat walking. Our implementation quickly constructed as-built three-dimensional environmental models and achieved a walking simulation speed sufficient for real-time applications. The difference in joint angles between the DHM and the MoCap data was sufficiently small. Demonstrations of our environmental modeling and walking simulation in an indoor environment are illustrated.
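The summary describes adapting recorded MoCap strides and turning angles to a scanned environment. As a rough illustration of that idea (this is a hypothetical sketch, not the authors' published algorithm: the function name `adapt_step` and the uniform-scale-plus-rotation approach are assumptions), one recorded walking step can be retargeted to a new stride length and heading change:

```python
import numpy as np

def adapt_step(pelvis_traj, target_stride, target_turn_rad):
    """Adapt one MoCap walking step to a target stride and turning angle.

    pelvis_traj: (N, 2) array of planar pelvis positions for one recorded
    step, starting at the origin and heading along +x.
    Returns the adapted (N, 2) trajectory.

    Hypothetical sketch: uniformly scales the recorded stride and applies
    a rigid planar rotation for the turn.
    """
    recorded_stride = np.linalg.norm(pelvis_traj[-1] - pelvis_traj[0])
    scaled = pelvis_traj * (target_stride / recorded_stride)
    c, s = np.cos(target_turn_rad), np.sin(target_turn_rad)
    rot = np.array([[c, -s], [s, c]])  # CCW rotation by target_turn_rad
    return scaled @ rot.T

# Example: a straight 0.6 m recorded step adapted to a 0.45 m stride
# with a 30-degree turn.
step = np.stack([np.linspace(0.0, 0.6, 20), np.zeros(20)], axis=1)
adapted = adapt_step(step, 0.45, np.deg2rad(30.0))
```

In the chapter's setting, the target stride and turn would presumably come from the laser-scanned environment (e.g., obstacle clearances and path curvature), with joint angles then resampled from the MoCap data to match the adapted step.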
ISBN: 3319210696, 9783319210698
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-21070-4_20