Path Reconstruction in Dynamic Wireless Sensor Networks Using Compressive Sensing

Bibliographic Details
Published in: IEEE/ACM Transactions on Networking, Vol. 24, No. 4, pp. 1948-1960
Main Authors: Liu, Zhidan; Li, Zhenjiang; Li, Mo; Xing, Wei; Lu, Dongming
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2016

Summary: This paper presents CSPR, a compressive-sensing-based approach for path reconstruction in wireless sensor networks. By viewing the whole network as a path representation space, an arbitrary routing path can be represented by a path vector in that space. As path length is usually much smaller than the network size, such path vectors are sparse, i.e., the majority of their elements are zeros. By encoding the sparse path representation into packets, the path vector (and thus the represented routing path) can be recovered from a small number of packets using compressive sensing techniques. CSPR formalizes the sparse path representation and enables accurate and efficient per-packet path reconstruction. By design, CSPR is robust to network dynamics and lossy links. A set of optimization techniques is further proposed to improve the design. We evaluate CSPR in both testbed-based experiments and large-scale trace-driven simulations. Evaluation results show that CSPR achieves high path recovery accuracy (100% and 96% in experiments and simulations, respectively) and outperforms state-of-the-art approaches in various network settings.
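The core idea in the summary — a routing path is a sparse vector over all network nodes, so it can be recovered from a few linear measurements — can be illustrated with a generic compressive-sensing sketch. The code below is not the authors' CSPR protocol: the Gaussian measurement matrix, node count, and path length are illustrative assumptions, and Orthogonal Matching Pursuit stands in for whichever recovery algorithm CSPR actually uses.

```python
# Illustrative sketch (not the CSPR protocol itself): recover a sparse
# "path vector" from a few random linear measurements via Orthogonal
# Matching Pursuit (OMP), the basic compressive-sensing recovery idea.
import numpy as np

def omp(A, y, sparsity):
    """Recover a `sparsity`-sparse x from y = A @ x using OMP."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(sparsity):
        # Greedily pick the column most correlated with the residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares fit restricted to the chosen support.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coeffs
        residual = y - A @ x
    return x

rng = np.random.default_rng(0)
n_nodes, path_len, n_packets = 100, 5, 40   # hypothetical sizes
# Sparse path vector: nonzero entries mark the nodes on the route.
x_true = np.zeros(n_nodes)
x_true[rng.choice(n_nodes, path_len, replace=False)] = 1.0
# Random Gaussian matrix stands in for the per-packet path encodings.
A = rng.standard_normal((n_packets, n_nodes)) / np.sqrt(n_packets)
y = A @ x_true                   # one measurement per received packet
x_rec = omp(A, y, path_len)
print(np.allclose(x_rec, x_true, atol=1e-6))
```

Note that only 40 "packets" suffice to pin down a length-5 path among 100 nodes, far fewer than the 100 measurements a dense recovery would need; this gap is exactly what makes the per-packet encoding overhead small.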
ISSN: 1063-6692, 1558-2566
DOI: 10.1109/TNET.2015.2435805