Dissertation/Thesis Abstract

Real-Time Individual Thermal Preferences Prediction Using Visual Sensors
by Cosma, Andrei Claudiu, Ph.D., The George Washington University, 2019, 147; 13422566
Abstract (Summary)

The thermal comfort of a building’s occupants is an important aspect of building design. Providing an increased level of thermal comfort is critical given that humans spend the majority of the day indoors, and that their well-being, productivity, and comfort depend on the quality of these environments. In today’s world, Heating, Ventilation, and Air Conditioning (HVAC) systems deliver heated or cooled air based on a fixed operating point or target temperature; individuals or building managers can adjust this operating point only by communicating dissatisfaction. What is currently lacking is the automatic detection of an individual’s thermal preferences in real time and the integration of these measurements into an HVAC system controller.

To address this gap, a non-invasive approach to automatically predict personal thermal comfort and the mean time to discomfort in real time is proposed and studied in this thesis. The goal of this research is to explore the consequences of human body thermoregulation on skin temperature and tone as a means to predict thermal comfort. For this reason, temperature information extracted from multiple local body parts and skin tone information extracted from the face are investigated as a means to model individual thermal preferences.

In the first study, we proposed a real-time system for predicting individual thermal preferences in transient conditions using temperature values from multiple local body parts. The proposed solution consists of a novel visual sensing platform, which we call RGB-DT, that fuses information from three sensors: a color camera, a depth sensor, and a thermographic camera. This platform was used to extract skin and clothing temperatures from multiple local body parts in real time. Using this method, personal thermal comfort was predicted with more than 80% accuracy, and mean time to warm discomfort was predicted with more than 85% accuracy.
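To make the idea concrete, the step from per-body-part temperatures to a comfort prediction can be sketched as a simple nearest-centroid classifier. This is a hypothetical illustration only: the body parts, temperature values, and class centroids below are assumptions, not the classifier or data used in the dissertation.

```python
# Hypothetical sketch: predict a comfort label from local body-part
# temperatures, in the spirit of the RGB-DT pipeline described above.
# Body parts, centroids, and temperatures are illustrative only.

BODY_PARTS = ["face", "hand", "forearm", "chest"]

# Illustrative class centroids (degrees C) for three comfort states.
CENTROIDS = {
    "cool discomfort": [33.0, 28.0, 29.5, 33.5],
    "comfortable":     [34.5, 31.0, 32.0, 34.0],
    "warm discomfort": [35.5, 33.5, 34.0, 35.0],
}

def predict_comfort(temps):
    """Nearest-centroid prediction from a {body_part: temperature} mapping."""
    vector = [temps[part] for part in BODY_PARTS]
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(vector, centroid))
    return min(CENTROIDS, key=lambda label: sq_dist(CENTROIDS[label]))
```

For example, `predict_comfort({"face": 34.4, "hand": 31.2, "forearm": 32.1, "chest": 34.0})` returns `"comfortable"`, since that reading lies closest to the comfortable centroid.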

In the second study, we introduced a new visual sensing platform and method that uses a single thermal image of the occupant to predict personal thermal comfort. We focused on close-up images of the occupant’s face to extract fine-grained details of the skin temperature. We extracted manually selected features as well as a set of automatically generated features. Results showed that the automated features outperformed the manual features in all tests that were run, and that these features predicted personal thermal comfort with more than 76% accuracy.
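The contrast between the two feature families can be sketched as follows. In this hypothetical example, "manual" features are mean temperatures over hand-picked facial regions, while "automated" features are grid-pooled block means computed without region selection; the regions and grid size are assumptions for illustration, not the dissertation's actual feature sets.

```python
# Hypothetical sketch of two feature families over a thermal face image
# (a 2D list of temperatures): hand-selected region statistics versus
# automated grid-pooled means. Regions and grid size are illustrative.

def manual_features(img, regions):
    """Mean temperature of hand-selected regions {name: (r0, r1, c0, c1)}."""
    feats = {}
    for name, (r0, r1, c0, c1) in regions.items():
        cells = [img[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        feats[name] = sum(cells) / len(cells)
    return feats

def automated_features(img, grid=2):
    """Grid-pooled means: split the image into grid x grid blocks."""
    rows, cols = len(img), len(img[0])
    rstep, cstep = rows // grid, cols // grid
    feats = []
    for gr in range(grid):
        for gc in range(grid):
            cells = [img[r][c]
                     for r in range(gr * rstep, (gr + 1) * rstep)
                     for c in range(gc * cstep, (gc + 1) * cstep)]
            feats.append(sum(cells) / len(cells))
    return feats
```

Either feature vector could then be fed to a standard classifier; the point of the sketch is that the automated features require no anatomical region annotation.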

The last study analyzed thermoregulation activity at the face level to predict skin temperature in the context of thermal comfort assessment. This solution uses a single color camera to model thermoregulation based on the visible side effects of vasodilation and vasoconstriction. To achieve this, new methods to isolate the skin tone response to an individual’s thermal regulation were explored. The relation between the extracted skin tone measurement and the skin temperature was analyzed using a regression model.
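The regression step can be sketched with ordinary least squares on a scalar skin-tone measurement. The data points below are invented for illustration (a redness index rising with skin temperature, consistent with vasodilation); the dissertation's actual measurements and model form are not reproduced here.

```python
# Hypothetical sketch: relate a scalar skin-tone measurement to skin
# temperature with simple linear regression (ordinary least squares).
# The data points are made up for illustration.

def fit_line(x, y):
    """Return slope a and intercept b of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Illustrative readings: redder skin tone alongside warmer skin.
tone = [0.10, 0.20, 0.30, 0.40]   # normalized redness index
temp = [33.0, 33.5, 34.0, 34.5]   # skin temperature, degrees C

slope, intercept = fit_line(tone, temp)
```

With these made-up points the fit is exact (`temp = 5.0 * tone + 32.5`); on real measurements one would also report the goodness of fit before trusting the model.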

Our experiments showed that a thermal model generated using non-invasive and contactless visual sensors can accurately predict individual thermal preferences in real time. Therefore, instantaneous feedback on the occupants' thermal comfort can be provided to the HVAC system controller to adjust the room temperature.

Indexing (document details)
Advisor: Simha, Rahul
Committee: Levy, Renato; Pless, Robert; Wood, Timothy
School: The George Washington University
Department: Computer Science
School Location: United States -- District of Columbia
Source: DAI-B 80/04(E), Dissertation Abstracts International
Subjects: Artificial intelligence, Computer science
Keywords: Building automation, HVAC control, Skin temperature, Thermal comfort, Thermographic camera, Thermoregulation system
Publication Number: 13422566
ISBN: 978-0-438-73373-2
Copyright © 2020 ProQuest LLC. All rights reserved.