An elastic net-based hybrid hypothesis method for compressed video sensing

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 74, No. 6, pp. 2085-2108
Main Authors: Chen, Jian; Chen, Yunzheng; Qin, Dong; Kuo, Yonghong
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.03.2015 (Springer Nature B.V.)
More Information
Summary: Compressed Sensing, an emerging framework for signal processing, can be applied to images and video, especially when the resources available at the transmitter are limited, as in Wireless Multimedia Sensor Networks. To meet low-cost and low-power demands, we consider plain compressive sampling at low sampling rates and propose a Compressed Video Sensing scheme. As a result, most of the video-processing burden is shifted to the decoder, which employs a hybrid hypothesis prediction method during reconstruction. The elastic net-based multi-hypothesis mode, one part of the prediction method, combines multi-hypothesis prediction with elastic net regression. During decoding, either this mode or the single-hypothesis mode is applied, depending on a threshold selected from [1e-11, 1). Both prediction modes operate in the measurement domain, and a residual reconstruction is executed as the final step to complete the recovery. According to the simulation results, the proposed multi-hypothesis mode provides better reconstruction quality than other multi-hypothesis methods, and the proposed scheme outperforms the observed state-of-the-art compressed-sensing video reconstruction schemes at low sampling rates.
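
To illustrate how the elastic net-based multi-hypothesis prediction described in the summary could operate in the measurement domain, the following sketch (not the authors' code) regresses a block's compressive measurements on the hypothesis blocks projected by the same measurement matrix, using scikit-learn's ElasticNet. The block size, sampling rate, hypothesis set, and regularization parameters are illustrative assumptions, not values taken from the paper.

```python
# A minimal sketch of elastic net-based multi-hypothesis prediction in the
# measurement domain, assuming a random Gaussian measurement matrix Phi and a
# stack of candidate hypothesis blocks H drawn from a reference frame.
# ElasticNet parameters (alpha, l1_ratio) are illustrative choices.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

n = 256            # pixels per block (e.g. a 16x16 block)
m = 32             # measurements per block (low sampling rate m/n = 0.125)
k = 40             # number of hypothesis blocks from the reference frame

Phi = rng.standard_normal((m, n)) / np.sqrt(m)       # measurement matrix
x = rng.standard_normal(n)                           # current (unknown) block
H = x[:, None] + 0.1 * rng.standard_normal((n, k))   # hypotheses similar to x

y = Phi @ x          # block measurements received at the decoder
PhiH = Phi @ H       # hypotheses projected into the measurement domain

# Elastic net regression of the measurements on the projected hypotheses.
enet = ElasticNet(alpha=1e-3, l1_ratio=0.5, fit_intercept=False, max_iter=10000)
enet.fit(PhiH, y)
w = enet.coef_                   # hypothesis weights

x_pred = H @ w                   # multi-hypothesis prediction in the pixel domain
y_residual = y - Phi @ x_pred    # residual measurements left for CS recovery

print("prediction SNR (dB):",
      10 * np.log10(np.sum(x**2) / np.sum((x - x_pred)**2)))
```

In the full scheme, the residual measurements would then be passed to a standard compressed-sensing recovery step, and the recovered residual added back to the prediction to complete the reconstruction.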
ISSN: 1380-7501
eISSN: 1573-7721
DOI: 10.1007/s11042-013-1743-y