Multimodal In-bed Pose and Shape Estimation under the Blankets

Bibliographic Details
Main Authors: Yin, Yu; Robinson, Joseph P; Fu, Yun
Format: Journal Article (arXiv preprint)
Language: English
Published: 12.12.2020
Subjects: Computer Science - Computer Vision and Pattern Recognition; Computer Science - Multimedia
Online Access: https://arxiv.org/abs/2012.06735
DOI: 10.48550/arxiv.2012.06735
Copyright: http://arxiv.org/licenses/nonexclusive-distrib/1.0

Abstract
Humans spend vast hours in bed -- about one-third of a lifetime on average. Moreover, monitoring humans at rest is vital in many healthcare applications. Since resting humans are typically covered by a blanket, we propose a multimodal approach to virtually uncover subjects so their bodies at rest can be viewed without the occlusion of the blankets above. We propose a pyramid scheme to effectively fuse the different modalities in a way that best leverages the knowledge captured by the multimodal sensors. Specifically, the two most informative modalities (i.e., depth and infrared images) are first fused to generate a good initial pose and shape estimate. The pressure map and RGB images are then fused in one by one to refine the result, providing occlusion-invariant information for the covered parts and accurate shape information for the uncovered parts, respectively. However, even with multimodal data, detecting human bodies at rest remains very challenging due to the extreme occlusion of the bodies. To further reduce the negative effects of occlusion by the blanket, we employ an attention-based reconstruction module to generate the uncovered modalities, which are fused in turn to update the current estimate in a cyclic fashion. Extensive experiments validate the superiority of the proposed model over others.
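
To make the staged fusion order concrete, below is a minimal sketch of the pyramid scheme the abstract describes, written in PyTorch. Everything here -- the module names, channel widths, encoder design, and the SMPL-style 85-dimensional output -- is a hypothetical illustration of fusing depth and infrared first, then pressure, then RGB; it is not the paper's actual architecture, and the attention-based reconstruction module and cyclic update are omitted for brevity.

```python
# Hypothetical sketch of pyramid-style multimodal fusion (not the paper's code).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Small conv encoder stage shared by every branch (assumed design)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class PyramidFusion(nn.Module):
    """Fuse depth + infrared first, then refine with pressure map, then RGB."""

    def __init__(self, feat_ch=64, param_dim=85):
        super().__init__()
        # Stage 1: the two most informative modalities (depth, infrared).
        self.depth_enc = conv_block(1, feat_ch)
        self.ir_enc = conv_block(1, feat_ch)
        self.fuse_depth_ir = conv_block(2 * feat_ch, feat_ch)
        # Stage 2: pressure map adds occlusion-invariant cues for covered parts.
        self.pm_enc = conv_block(1, feat_ch)
        self.fuse_pm = conv_block(2 * feat_ch, feat_ch)
        # Stage 3: RGB adds accurate shape cues for uncovered parts.
        self.rgb_enc = conv_block(3, feat_ch)
        self.fuse_rgb = conv_block(2 * feat_ch, feat_ch)
        # Regress pose/shape parameters (e.g., SMPL: 72 pose + 10 shape + 3 camera).
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(feat_ch, param_dim)
        )

    def forward(self, depth, ir, pressure, rgb):
        # Initial estimate from depth + infrared, refined stage by stage.
        f = self.fuse_depth_ir(
            torch.cat([self.depth_enc(depth), self.ir_enc(ir)], dim=1))
        f = self.fuse_pm(torch.cat([f, self.pm_enc(pressure)], dim=1))
        f = self.fuse_rgb(torch.cat([f, self.rgb_enc(rgb)], dim=1))
        return self.head(f)

# Toy usage with 128x128 inputs for a batch of two frames.
model = PyramidFusion()
depth = torch.randn(2, 1, 128, 128)
ir = torch.randn(2, 1, 128, 128)
pressure = torch.randn(2, 1, 128, 128)
rgb = torch.randn(2, 3, 128, 128)
params = model(depth, ir, pressure, rgb)
print(params.shape)  # torch.Size([2, 85])
```

In the full method described by the abstract, the attention-based reconstruction module would additionally synthesize "uncovered" versions of the input modalities, which would be passed back through a fusion step like the one above to refine the estimate cyclically.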