Subject-adaptive Skeleton Tracking with RFID
| Published in | 2020 16th International Conference on Mobility, Sensing and Networking (MSN), pp. 599–606 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.12.2020 |
| Summary | With the rapid development of computer vision, human pose tracking has attracted increasing attention in recent years. To address privacy concerns, it is desirable to develop techniques that do not rely on a video camera. To this end, RFID tags can serve as low-cost wearable sensors that provide an effective solution for 3D human pose tracking. Subject adaptability is another major challenge in RF-based pose tracking, i.e., how to apply a well-trained model to untrained subjects. In this paper, we propose Cycle-Pose, a subject-adaptive, real-time 3D human pose estimation system based on deep learning and assisted by computer vision for model training. In Cycle-Pose, RFID phase data is calibrated to mitigate severe phase distortion, and High-Accuracy Low-Rank Tensor Completion (HaLRTC) is employed to impute missing RFID data. A cycle kinematic network is proposed to remove the requirement of paired RFID and vision data for model training. The resulting system is subject-adaptive: it learns to transform RFID data into a human skeleton for different subjects. A prototype is built with commodity RFID tags and devices and evaluated experimentally. Compared with the prior system RFID-Pose, Cycle-Pose achieves higher pose estimation accuracy and better subject adaptability in our experiments, using Kinect 2.0 data as ground truth. |
| DOI | 10.1109/MSN50589.2020.00098 |
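
The abstract's pipeline suggests three steps concrete enough to sketch. First, phase calibration: commodity RFID readers report tag phase wrapped into [0, 2π), often with an additional π ambiguity introduced by the reader's receive chain, so the raw series must be unwrapped before it tracks tag motion smoothly. The sketch below shows only this generic unwrapping step, assuming the π ambiguity; the paper's actual calibration procedure is not described in this record, and the function name is illustrative.

```python
import numpy as np

def calibrate_phase(raw_phase):
    """Unwrap a raw RFID phase series (illustrative sketch, not the
    paper's exact procedure). Commodity readers report phase wrapped
    into [0, 2*pi), with a possible extra pi jump from the receive
    chain, so the usable period is assumed to be pi rather than 2*pi."""
    phase = np.mod(np.asarray(raw_phase, dtype=float), np.pi)  # fold out the pi ambiguity
    return np.unwrap(phase, period=np.pi)  # `period` requires NumPy >= 1.21

# Example: a smooth phase track, wrapped by the reader, then recovered
# up to a constant offset.
t = np.linspace(0.0, 4.0 * np.pi, 200)
wrapped = np.mod(t, np.pi)
recovered = calibrate_phase(wrapped)
```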
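Second, missing-data imputation with HaLRTC, which the abstract names explicitly. HaLRTC (Liu et al., TPAMI 2013) completes a partially observed tensor, e.g. tags × antennas × time for RFID readings, by minimizing a weighted sum of nuclear norms of the tensor's mode unfoldings via ADMM. Below is a minimal NumPy sketch of the standard algorithm; the weights, penalty rho, and iteration count are assumptions, not the paper's settings.

```python
import numpy as np

def _unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def _fold(M, mode, shape):
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def _svt(M, tau):
    """Singular value thresholding: the proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def halrtc(T, mask, alpha=None, rho=1e-2, n_iter=100):
    """Simplified HaLRTC: complete the entries of T where mask is False,
    keeping observed entries (mask True) fixed."""
    n = T.ndim
    alpha = np.ones(n) / n if alpha is None else alpha
    X = np.where(mask, T, 0.0)
    Y = [np.zeros_like(T) for _ in range(n)]
    for _ in range(n_iter):
        # Low-rank surrogate for each mode via singular value thresholding.
        M = [_fold(_svt(_unfold(X + Y[i] / rho, i), alpha[i] / rho), i, T.shape)
             for i in range(n)]
        # Average the surrogates, then re-impose the observed entries.
        X = sum(M[i] - Y[i] / rho for i in range(n)) / n
        X[mask] = T[mask]
        # Dual variable updates.
        for i in range(n):
            Y[i] = Y[i] - rho * (M[i] - X)
    return X
```

The per-mode thresholding is what encodes the low-rank prior: RFID streams from multiple tags and antennas are highly correlated over time, so short dropouts can be recovered from the surviving entries.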
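Third, the cycle kinematic network removes the paired-data restriction. The record does not give the architecture, but the cycle-consistency idea can be illustrated with a CycleGAN-style objective: a forward network maps RFID features to a skeleton, an inverse network maps skeletons back to RFID features, and each round trip must reconstruct its input, so unpaired RFID and vision (Kinect) samples suffice for training. All dimensions, layer sizes, and names below are hypothetical.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: flattened per-frame RFID features, and a
# 25-joint Kinect 2.0 skeleton flattened to 75 coordinates.
RFID_DIM, SKEL_DIM = 24, 75

def mlp(din, dout):
    return nn.Sequential(nn.Linear(din, 256), nn.ReLU(),
                         nn.Linear(256, 256), nn.ReLU(),
                         nn.Linear(256, dout))

F = mlp(RFID_DIM, SKEL_DIM)   # forward model: RFID features -> skeleton
G = mlp(SKEL_DIM, RFID_DIM)   # inverse model: skeleton -> RFID features
opt = torch.optim.Adam(list(F.parameters()) + list(G.parameters()), lr=1e-3)
l1 = nn.L1Loss()

def train_step(rfid_batch, skel_batch):
    """One unpaired step: rfid_batch and skel_batch need not come from
    the same subject, session, or time instant."""
    opt.zero_grad()
    # Cycle 1: RFID -> skeleton -> RFID should reproduce the input features.
    loss = l1(G(F(rfid_batch)), rfid_batch)
    # Cycle 2: skeleton -> RFID -> skeleton should reproduce the skeleton.
    loss = loss + l1(F(G(skel_batch)), skel_batch)
    loss.backward()
    opt.step()
    return loss.item()

rfid = torch.randn(32, RFID_DIM)   # unpaired batches
skel = torch.randn(32, SKEL_DIM)
print(train_step(rfid, skel))
```

A full system of this kind would typically add adversarial discriminators on both domains and a kinematic prior on the predicted skeleton; the two cycle losses alone are the part that removes the pairing requirement, which is what enables adaptation to untrained subjects.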