Markerless motion evaluation via OpenPose and fuzzy activity evaluator


Bibliographic Details
Published in: Journal of the Chinese Institute of Engineers, Vol. 45, No. 8, pp. 697-705
Main Authors: Hsiao, Kai; Liu, Tung-Kuan; Lin, Paul P.
Format: Journal Article
Language: English
Published: Taylor & Francis, 17 November 2022

Summary: Human motion evaluation techniques can be categorized as marker-based and markerless. The former can be time-consuming and expensive, while the latter has the advantage of being online, fast, and noninvasive. This study presents the development of a markerless system for push-up motion evaluation by means of OpenPose, Python code, and fuzzy inference. The markerless technique generally located human joint centers with good accuracy and formed the skeleton from them. A fuzzy activity evaluator was introduced to help resolve uncertainty that can occur at the elbow joint due to possible abduction or adduction. The fuzzy evaluator used the degree of truth of each of the three joint angles to assess the overall quality of motion. With the aid of fuzzy inference, the evaluation system produced satisfactory and reliable results by taking both qualitative and quantitative measures into account. The presented technique demonstrated how a combination of hard computing and soft computing can be a powerful tool for developing reliable motion evaluation. Notably, the developed system used low-cost hardware and open-source software. The technique can also be applied to the evaluation of rehabilitation exercise, or to industrial safety, such as instantly identifying improper or dangerous motions in production.
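The evaluator described above, which assigns a degree of truth to each of three joint angles and combines them into an overall motion-quality score, can be sketched as follows. This is a minimal illustrative sketch only: the paper's actual membership functions, angle targets, and rule base are not given in the abstract, so the triangular memberships, the specific joints, and the min-conjunction below are all assumptions.

```python
def triangular(x, a, b, c):
    """Triangular membership function: returns a degree of truth in [0, 1],
    peaking at b and falling to 0 at a and c. A common fuzzy-logic primitive;
    the paper may use a different membership shape."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)


def evaluate_pushup(elbow, shoulder, hip):
    """Combine degrees of truth for three joint angles (in degrees) into an
    overall push-up quality score. The joint choices and angle ranges here
    are hypothetical, not taken from the paper."""
    truths = {
        "elbow": triangular(elbow, 60, 90, 120),    # bent-arm bottom position
        "shoulder": triangular(shoulder, 30, 60, 90),
        "hip": triangular(hip, 150, 180, 210),      # body held straight
    }
    # Conservative conjunction: overall quality is limited by the weakest joint.
    overall = min(truths.values())
    return overall, truths


score, detail = evaluate_pushup(elbow=90, shoulder=60, hip=180)
```

Using the minimum as the conjunction means a single badly placed joint pulls the whole score down, which matches the abstract's idea of catching uncertainty at one joint (e.g. the elbow); a weighted average would instead let good joints mask a bad one.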
ISSN: 0253-3839
eISSN: 2158-7299
DOI: 10.1080/02533839.2022.2126404