Sparse signal recovery via generalized Gaussian function
Published in | Journal of Global Optimization, Vol. 83, No. 4, pp. 783-801 |
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | New York: Springer US, 01.08.2022 (Springer; Springer Nature B.V.) |
Subjects | |
Summary: | In this paper, we replace the $\ell_0$ norm with the variation of the generalized Gaussian function $\Phi_\alpha(x)$ in sparse signal recovery. We first show that $\Phi_\alpha(x)$ is a non-convex sparsity-promoting function, and we establish the equivalence among the three minimization models $(P_0)\colon \min_{x \in \mathbb{R}^n} \|x\|_0 \ \text{subject to} \ Ax = b$; $(E_\alpha)\colon \min_{x \in \mathbb{R}^n} \Phi_\alpha(x) \ \text{subject to} \ Ax = b$; and $(E_\alpha^\lambda)\colon \min_{x \in \mathbb{R}^n} \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \Phi_\alpha(x)$. The established equivalence theorems show that $(P_0)$ can be solved completely by solving the continuous minimization $(E_\alpha)$ for certain values of $\alpha$, while the latter can be computed by solving the regularized minimization $(E_\alpha^\lambda)$ under certain conditions. Second, based on the DC (difference-of-convex) algorithm and the iterative soft-thresholding algorithm, we give an algorithm for the regularized minimization $(E_\alpha^\lambda)$, called the DCS algorithm. Finally, extensive simulations compare the DCS algorithm with two classical algorithms, the half algorithm and the soft algorithm, and the experimental results show that the DCS algorithm performs well in sparse signal recovery. (An illustrative sketch of a DC-plus-soft-thresholding loop of this kind follows this record.) |
ISSN: | 0925-5001; 1573-2916 |
DOI: | 10.1007/s10898-022-01126-2 |
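The abstract describes combining the DC (difference-of-convex) algorithm with iterative soft thresholding to handle the regularized problem $(E_\alpha^\lambda)$. Below is a minimal sketch of such a loop. This record does not give the exact form of $\Phi_\alpha(x)$, so the penalty $\phi_\alpha(t) = 1 - e^{-\alpha|t|}$, the parameter values, and the helper names `soft_threshold` and `dc_ista_sketch` are illustrative assumptions, not the authors' DCS algorithm.

```python
import numpy as np

# Assumed penalty: phi_alpha(t) = 1 - exp(-alpha*|t|), a non-convex
# sparsity-promoting surrogate of the l0 norm (illustrative; the paper's
# exact Phi_alpha is not given in this record). It admits the DC split
#   phi_alpha(t) = alpha*|t| - h(t),  h(t) = alpha*|t| - 1 + exp(-alpha*|t|),
# where both alpha*|t| and h are convex.

def soft_threshold(v, tau):
    """Componentwise soft-thresholding (prox of tau*||.||_1)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def dc_ista_sketch(A, b, lam=0.1, alpha=5.0, outer=20, inner=100):
    """Sketch of a DC outer loop with an ISTA inner loop for
       min_x 0.5*||Ax - b||_2^2 + lam * sum_i phi_alpha(x_i)."""
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the data-term gradient
    for _ in range(outer):
        # Gradient of the convex part h at the current iterate:
        # h'(t) = alpha*sign(t)*(1 - exp(-alpha*|t|)).
        w = alpha * np.sign(x) * (1.0 - np.exp(-alpha * np.abs(x)))
        # Convex subproblem (concave part linearized at x):
        #   min_z 0.5*||Az - b||^2 + lam*alpha*||z||_1 - lam*<w, z>,
        # solved approximately by iterative soft thresholding.
        z = x.copy()
        for _ in range(inner):
            grad = A.T @ (A @ z - b) - lam * w
            z = soft_threshold(z - grad / L, lam * alpha / L)
        x = z
    return x

# Toy usage on a random sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = dc_ista_sketch(A, b)
print("estimated support:", np.nonzero(np.round(x_hat, 3))[0])
```

The outer loop linearizes the concave part of the DC split at the current iterate, and each inner loop is plain ISTA on the resulting $\ell_1$-regularized least-squares subproblem; step sizes, stopping rules, and the penalty form would need to follow the paper for a faithful implementation.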