A tight bound of modified iterative hard thresholding algorithm for compressed sensing

Bibliographic Details
Published in: Applications of Mathematics (Prague), Vol. 68, No. 5, pp. 623–642
Main Authors: Ma, Jinyao; Zhang, Haibin; Yang, Shanshan; Jiang, Jiaojiao
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg / Springer Nature B.V., 01.10.2023
Summary: We provide a theoretical study of the iterative hard thresholding with partially known support set (IHT-PKS) algorithm when used to solve the compressed sensing recovery problem. Recent work has shown that IHT-PKS performs better than traditional IHT in reconstructing sparse or compressible signals. However, less work has been done on analyzing the performance guarantees of IHT-PKS. In this paper, we improve the current RIP-based bound of the IHT-PKS algorithm from δ_{3s−2k} < 1/√32 ≈ 0.1768 to δ_{3s−2k} < (√5 − 1)/4, where δ_{3s−2k} is the restricted isometry constant of the measurement matrix. We also present the conditions for stable reconstruction using the IHTμ-PKS algorithm, which is a general form of IHT-PKS. We further apply the algorithm to Least Squares Support Vector Machines (LS-SVM), one of the most popular tools for regression and classification learning, which suffers from a loss of sparsity. After the sparse representation of LS-SVM is formulated via compressed sensing, we exploit the support of the bias term in the LS-SVM model with the IHTμ-PKS algorithm. Experimental results on classification problems show that IHTμ-PKS outperforms other approaches to computing the sparse LS-SVM classifier.
ISSN: 0862-7940, 1572-9109
DOI: 10.21136/AM.2023.0221-22
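
As an illustration of the IHT-PKS iteration summarized above, the following is a minimal NumPy sketch of hard thresholding with a partially known support: a gradient step on the least-squares data-fidelity term is followed by a projection that keeps every coefficient on the known support and the s − k largest remaining coefficients. The function name iht_pks, the fixed iteration count, and the default step size mu = 1 are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def iht_pks(A, y, s, known_support, mu=1.0, n_iters=200):
    """Minimal sketch of IHT with a partially known support (IHT-PKS).

    Illustrative assumptions (not from the paper): the function name,
    the fixed iteration count, and the default step size mu = 1.

    A             : (m, n) measurement matrix
    y             : (m,) measurement vector
    s             : target sparsity of the signal to recover
    known_support : indices believed to lie in the true support, size k <= s
    mu            : gradient step size (the "mu" in IHTmu-PKS)
    """
    m, n = A.shape
    T0 = np.asarray(known_support, dtype=int)
    k = T0.size
    x = np.zeros(n)
    for _ in range(n_iters):
        # Gradient step on the least-squares objective 0.5 * ||y - A x||^2
        g = x + mu * (A.T @ (y - A @ x))

        # Projection: keep every coefficient on the known support T0 ...
        x_next = np.zeros(n)
        x_next[T0] = g[T0]

        # ... plus the s - k largest-magnitude coefficients outside T0
        off_support = np.setdiff1d(np.arange(n), T0)
        if s > k:
            largest = off_support[np.argsort(np.abs(g[off_support]))[-(s - k):]]
            x_next[largest] = g[largest]
        x = x_next
    return x
```

With an empty known_support and mu = 1 the projection reduces to keeping the s largest coefficients, i.e. plain IHT, which is the sense in which IHTμ-PKS generalizes both the step size and the known-support variants.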