A context-based privacy preserving framework for wearable visual lifeloggers


Bibliographic Details
Published in: 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), pp. 1-4
Main Authors: Zarepour, Eisa; Hosseini, Mohammadreza; Kanhere, Salil S.; Sowmya, Arcot
Format: Conference Proceeding
Language: English
Published: IEEE, 01.03.2016

Summary: The ability of wearable cameras to continuously capture the first-person viewpoint with minimal user interaction has made them very attractive in many application domains. Although wearable technology is readily available and useful today, it is not widely used and accepted, chiefly because of privacy concerns. In this paper, we introduce a novel, efficient privacy-aware framework for wearable cameras that can protect all sensitive subjects, including people, objects (e.g., display screens, license plates and credit cards) and locations (e.g., bathrooms and bedrooms). The framework uses contextual information obtained from the wearable's sensors and recorded images to identify potentially sensitive subjects in each image. Using image processing techniques, the sensitive subjects are first recognized and then blurred or eliminated; the edited image is then ready to publish. We experimentally evaluate the proposed framework on a practical case study. Our results show that the proposed system detects and blurs the sensitive subjects in a set of around 300 images with approximately 70% accuracy.
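The abstract describes a two-stage pipeline: a cheap contextual check (from the wearable's sensors) flags potentially sensitive frames, and image processing then blurs or removes the sensitive regions before publishing. The following is only a minimal sketch of that idea, not the authors' implementation; the context tags, function names and the naive box blur are all illustrative assumptions.

```python
# Illustrative sketch of the two-stage privacy pipeline described in the
# abstract. All names and thresholds here are assumptions, not the paper's API.

# Assumed set of context tags that mark a location as sensitive.
SENSITIVE_LOCATIONS = {"bathroom", "bedroom"}


def frame_is_potentially_sensitive(context):
    """Stage 1: cheap context check using wearable sensor metadata.

    `context` is a dict such as {"location": "bathroom", "faces_detected": 2}.
    """
    return (context.get("location") in SENSITIVE_LOCATIONS
            or context.get("faces_detected", 0) > 0)


def box_blur_region(image, x0, y0, x1, y1, k=1):
    """Stage 2: naive box blur over the rectangle [x0,x1) x [y0,y1).

    `image` is a grayscale image stored as a list of lists of ints.
    Returns a new image; the input is left unmodified.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(y0, y1):
        for x in range(x0, x1):
            # Average the (2k+1) x (2k+1) neighbourhood, clipped to the image.
            vals = [image[yy][xx]
                    for yy in range(max(0, y - k), min(h, y + k + 1))
                    for xx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(vals) // len(vals)
    return out
```

In a real system the detection stage would use trained detectors (faces, screens, license plates) and a proper Gaussian blur from an image library; the sketch only shows how the context gate and the region-level redaction fit together.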
DOI:10.1109/PERCOMW.2016.7457057