LiteDEKR: End‐to‐end lite 2D human pose estimation network

Bibliographic Details
Published in: IET Image Processing, Vol. 17, No. 12, pp. 3392-3400
Main Authors: Lv, Xueqiang; Hao, Wei; Tian, Lianghai; Han, Jing; Chen, Yuzhong; Cai, Zangtai
Format: Journal Article
Language: English
Published: Wiley, 01.10.2023

Summary: 2D human pose estimation plays an important role in human-computer interaction and action recognition. Although methods based on high-resolution networks achieve superior accuracy, there is still room for improvement in speed and model size. Here, LiteDEKR, a 2D pose estimation method that combines lightweight design with accuracy, is proposed by building a lightweight network on DEKR and constructing two well-founded loss functions. The method constructs a multi-instance bias regression loss that matches the true distribution of keypoint bias, improving the accuracy of bias regression, and a keypoint similarity loss that takes the object keypoint similarity (OKS) of the keypoints as its optimization objective, enabling end-to-end training of the network. In addition, a lightweight DEKR is designed using LitePose as the backbone network. Combined with the two loss functions above, LiteDEKR is not only lightweight but also highly accurate. Comparative experiments on the COCO and CrowdPose datasets show that, compared with the current state-of-the-art Contextual Instance Decoupling, LiteDEKR achieves similar accuracy with only 10% of its network complexity. It also shows better robustness to low-resolution input images.
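
The abstract describes a keypoint similarity loss that uses OKS as the optimization objective. As an illustration only, the sketch below shows a minimal 1 − OKS loss following the standard COCO OKS definition; the function name `oks_loss`, its signature, and the tensor shapes are assumptions for this example and are not taken from the paper, and the multi-instance bias regression loss is not shown.

```python
# Illustrative sketch of an OKS-based keypoint similarity loss (not the paper's
# implementation). Uses the standard COCO OKS formulation with per-keypoint
# falloff constants; LiteDEKR's actual loss and weighting are defined in the paper.
import torch

# Per-keypoint falloff constants (COCO convention, 17 keypoints).
COCO_SIGMAS = torch.tensor([
    0.026, 0.025, 0.025, 0.035, 0.035, 0.079, 0.079, 0.072, 0.072,
    0.062, 0.062, 0.107, 0.107, 0.087, 0.087, 0.089, 0.089
])

def oks_loss(pred, gt, visibility, area, sigmas=COCO_SIGMAS, eps=1e-6):
    """Return mean(1 - OKS) between predicted and ground-truth keypoints.

    pred, gt:    (N, K, 2) keypoint coordinates in pixels
    visibility:  (N, K) 1 for labelled keypoints, 0 otherwise
    area:        (N,) object area used as the scale term
    """
    d2 = ((pred - gt) ** 2).sum(dim=-1)                 # squared distances, (N, K)
    kappa = (2.0 * sigmas) ** 2                          # per-keypoint constants
    denom = 2.0 * kappa.unsqueeze(0) * (area.unsqueeze(1) + eps)
    per_kpt = torch.exp(-d2 / denom) * visibility        # per-keypoint similarity
    oks = per_kpt.sum(dim=1) / visibility.sum(dim=1).clamp(min=1)
    return (1.0 - oks).mean()                            # minimizing this maximizes OKS

# Example usage with random tensors:
# pred = torch.rand(4, 17, 2) * 256
# gt = torch.rand(4, 17, 2) * 256
# vis = torch.ones(4, 17)
# area = torch.full((4,), 96.0 * 96.0)
# loss = oks_loss(pred, gt, vis, area)
```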
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12871