Automatic estimation of clothing insulation rate and metabolic rate for dynamic thermal comfort assessment

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), Vol. 25, No. 3, pp. 619-634
Main Authors: Liu, Jinsong; Foged, Isak Worre; Moeslund, Thomas B.
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.08.2022
Summary: Existing heating, ventilation, and air-conditioning systems have difficulty accounting for occupants' dynamic thermal needs, resulting in overheating or overcooling and substantial energy waste. This situation underlines the importance of occupant-oriented microclimate control, for which dynamic individual thermal comfort assessment is key. This paper therefore proposes a vision-based approach to estimate the individual clothing insulation rate (I_cl) and metabolic rate (M), the two critical factors for assessing personal thermal comfort. Specifically, with a thermal camera as the input source, a convolutional neural network (CNN) recognizes an occupant's clothes type and activity type simultaneously. The recognized clothes type then helps to differentiate the skin region from the clothing-covered region, allowing the skin temperature and the clothes temperature to be calculated. With the two recognized types and the two computed temperatures, I_cl and M can be estimated effectively. In the experimental phase, a novel thermal dataset is introduced, which allows evaluation of the CNN-based recognizer module, the skin and clothes temperature acquisition module, and the I_cl and M estimation module, demonstrating the effectiveness and automation of the proposed approach.
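The abstract describes a two-stage pipeline: the CNN produces clothes-type and activity-type labels from the thermal image, and those labels drive both the skin/clothes region split and the mapping to I_cl and M. The sketch below illustrates only the post-recognition step, with hypothetical class names, a stand-in skin mask, and approximate clo/met values of the kind found in standard comfort tables (ASHRAE 55 / ISO 7730); the paper's actual class sets, value tables, and segmentation method are not reproduced here.

```python
import numpy as np

# Illustrative lookup tables with approximate values typical of standard
# comfort references; the paper's actual classes and mappings may differ.
CLO_TABLE = {                # clothing insulation rate I_cl, in clo
    "t_shirt_shorts": 0.36,
    "short_sleeve_trousers": 0.57,
    "long_sleeve_trousers": 0.61,
    "sweater_trousers": 1.0,
}
MET_TABLE = {                # metabolic rate M, in met
    "seated_quiet": 1.0,
    "standing_relaxed": 1.2,
    "walking": 1.7,
}


def region_temperatures(thermal_frame: np.ndarray, skin_mask: np.ndarray):
    """Mean skin and clothes temperatures from a calibrated thermal frame.

    `thermal_frame` holds per-pixel temperatures (deg C); `skin_mask` is a
    boolean array marking exposed-skin pixels (in the paper, derived from
    the recognized clothes type). Pixels outside the person would normally
    be excluded by a person mask, omitted here for brevity.
    """
    t_skin = float(thermal_frame[skin_mask].mean())
    t_clothes = float(thermal_frame[~skin_mask].mean())
    return t_skin, t_clothes


def estimate_icl_and_m(clothes_type: str, activity_type: str,
                       thermal_frame: np.ndarray, skin_mask: np.ndarray):
    """Map the recognizer's labels to I_cl and M, plus region temperatures."""
    i_cl = CLO_TABLE[clothes_type]
    m = MET_TABLE[activity_type]
    t_skin, t_clothes = region_temperatures(thermal_frame, skin_mask)
    return i_cl, m, t_skin, t_clothes


if __name__ == "__main__":
    # Stand-in for a thermal camera frame and a skin mask produced upstream.
    frame = np.full((120, 160), 31.0)      # clothed surface around 31 degC
    mask = np.zeros((120, 160), dtype=bool)
    mask[:30, 60:100] = True               # assumed exposed-skin (face) region
    frame[mask] = 34.0                     # exposed skin around 34 degC
    print(estimate_icl_and_m("short_sleeve_trousers", "seated_quiet",
                             frame, mask))
```

The estimated I_cl and M (together with the two temperatures) are exactly the inputs a downstream thermal comfort model, such as a PMV-style calculation, would consume; the sketch stops at producing them.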
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-021-00961-5