Trajectory Privacy Protection on Spatial Streaming Data with Differential Privacy
Published in | 2018 IEEE Global Communications Conference (GLOBECOM), pp. 1 - 7 |
---|---|
Main Authors | , , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.12.2018 |
Subjects | |
Summary: | Continuously sharing users' trajectory data, which contain their location information, makes crowd sensing of traffic dynamics and mobility trends feasible. Such spatial streaming data are beneficial for intelligent transportation but risk disclosing personal privacy, even when published in statistical form such as "the number of users in an area at time t". The number of users at a location at time t is similar both to that of the previous release at the same location and to that at adjacent locations. This spatio-temporal correlation makes protecting users' trajectory privacy a challenge. The state-of-the-art privacy protection framework, differential privacy, has been extended to the streaming scenario to prevent privacy leakage caused by temporal correlation. However, such schemes neglect the importance of spatial correlation, so they may suffer either leakage of user trajectory privacy or degradation of data utility. Based on the observation that any piece of trajectory has temporal and spatial locality, we propose a flexible trajectory privacy model, ω-event n²-block differential privacy, abbreviated as (ω, n)-differential privacy, which ensures that any trajectory occurring in an area of n×n blocks during ω successive timestamps is protected under ε-differential privacy. We then design the Spatial Temporal Budget Distribution (STBD) algorithm to achieve (ω, n)-differential privacy. Validation results of this algorithm on two real-life datasets and one synthetic dataset confirm its practicality. |
---|---|
ISSN: | 2576-6813 |
DOI: | 10.1109/GLOCOM.2018.8647918 |
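As a rough illustration of the guarantee described in the summary, the sketch below splits the privacy budget ε uniformly across the ω timestamps of the sliding window and adds Laplace noise to the per-block user counts. This uniform baseline is an assumption for illustration only; it is not the paper's STBD algorithm, which distributes the budget adaptively over both the temporal window and the spatial blocks. The grid size, parameter values, and function names are likewise assumed.

```python
# Minimal sketch of the streaming-count setting from the abstract, assuming a
# naive uniform per-timestamp budget split. NOT the paper's STBD algorithm,
# which allocates budget adaptively over time and space.
import numpy as np

def publish_noisy_counts(true_counts, epsilon, omega):
    """Release noisy per-block user counts for one timestamp.

    true_counts : 2-D array of true user counts per grid block at time t.
    epsilon     : total budget intended to cover any omega successive releases.
    omega       : length of the sliding temporal window (the w / omega above).

    A user occupies one block per timestamp, so each release has sensitivity 1.
    Spending epsilon / omega per release, Laplace noise with scale omega / epsilon
    keeps the budget consumed over any window of omega successive timestamps
    (and hence over any trajectory confined to an n x n area within that window)
    at most epsilon, by sequential composition.
    """
    per_release_eps = epsilon / omega
    noise = np.random.laplace(loc=0.0, scale=1.0 / per_release_eps,
                              size=np.shape(true_counts))
    return np.asarray(true_counts, dtype=float) + noise

# Illustrative use: a 10 x 10 grid of blocks, epsilon = 1.0, window omega = 5.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = [rng.integers(0, 50, size=(10, 10)) for _ in range(5)]
    releases = [publish_noisy_counts(c, epsilon=1.0, omega=5) for c in stream]
    print(releases[0].round(1))
```

The uniform split shown here protects every trajectory in the window, which is stronger but noisier than necessary; the (ω, n) model restricts protection to trajectories inside an n×n area, which is the slack the STBD algorithm exploits to improve utility.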