An Efficient Sparse Bayesian Learning Algorithm Based on Gaussian-Scale Mixtures
Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. PP, No. 7, pp. 1-14
---|---
Main Authors | , ,
Format | Journal Article
Language | English
Published | United States: IEEE, 01.07.2022 (The Institute of Electrical and Electronics Engineers, Inc.)
Subjects |
Summary: Sparse Bayesian learning (SBL) is a popular machine learning approach with superior generalization capability due to the sparsity of its adopted model. However, it entails a matrix inversion at each iteration, hindering its practical application to large-scale data sets. To overcome this bottleneck, we propose an efficient SBL algorithm with O(n²) computational complexity per iteration based on a Gaussian-scale mixture prior model. By specifying two different hyperpriors, the proposed efficient SBL algorithm can meet two different requirements, namely high efficiency and high sparsity. A surrogate function is introduced to approximate the posterior density of the model parameters and thereby avoid matrix inversions. Using a data-dependent term, a joint cost function with separate penalty terms is reformulated in the joint space of model parameters and hyperparameters. The resulting nonconvex optimization problem is solved using a block coordinate descent method within a majorization-minimization framework. Finally, extensive experiments on sparse signal recovery and sparse image reconstruction benchmarks substantiate the effectiveness and superiority of the proposed approach in terms of computational time and estimation error.
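The summary names the key computational ingredients: a standard SBL update whose per-iteration matrix inversion is the bottleneck, and an inversion-free alternative obtained by minimizing a surrogate with block coordinate descent in a majorization-minimization framework. The sketch below illustrates both ideas in generic form only; the Lipschitz-bound surrogate, the Jeffreys-style hyperparameter update, and all variable names are assumptions made for illustration, not the paper's Gaussian-scale-mixture construction or its O(n²) update.

```python
import numpy as np


def sbl_classical_step(Phi, t, alpha, beta):
    """One textbook SBL iteration (generic; not the paper's algorithm).

    The explicit posterior covariance requires inverting an m x m matrix,
    which costs O(m^3) per iteration -- the bottleneck mentioned in the
    summary.
    """
    Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)  # O(m^3) inversion
    mu = beta * Sigma @ Phi.T @ t                               # posterior mean
    gamma = 1.0 - alpha * np.diag(Sigma)                        # effective degrees of freedom
    alpha_new = gamma / (mu ** 2 + 1e-12)                       # evidence-style hyperparameter update
    return mu, alpha_new


def sbl_mm_bcd_step(Phi, t, w, alpha, beta, L):
    """One inversion-free step in a generic MM / block-coordinate-descent style.

    Illustrative assumption: the quadratic data term is majorized at the
    current weights w using a Lipschitz bound L >= lambda_max(Phi^T Phi),
    so the weight block reduces to a diagonal solve built from
    matrix-vector products (O(N*m) per step). The hyperparameter block
    then uses a simple Jeffreys-style update. This is not the paper's
    specific surrogate or Gaussian-scale-mixture hyperprior.
    """
    grad = beta * (Phi.T @ (Phi @ w - t))               # gradient of the data term at w
    w_new = (beta * L * w - grad) / (beta * L + alpha)  # closed-form minimizer of the surrogate
    alpha_new = 1.0 / (w_new ** 2 + 1e-12)              # illustrative hyperparameter update
    return w_new, alpha_new


if __name__ == "__main__":
    # Minimal synthetic sparse-recovery example (all sizes illustrative).
    rng = np.random.default_rng(0)
    N, m, k = 100, 256, 10
    Phi = rng.standard_normal((N, m)) / np.sqrt(N)
    w_true = np.zeros(m)
    w_true[rng.choice(m, size=k, replace=False)] = rng.standard_normal(k)
    t = Phi @ w_true + 0.01 * rng.standard_normal(N)

    beta = 1.0 / 0.01 ** 2                   # noise precision (assumed known here)
    L = np.linalg.norm(Phi, 2) ** 2          # spectral bound on lambda_max(Phi^T Phi)
    w, alpha = np.zeros(m), np.ones(m)
    for _ in range(300):
        w, alpha = sbl_mm_bcd_step(Phi, t, w, alpha, beta, L)
    print("relative estimation error:",
          np.linalg.norm(w - w_true) / np.linalg.norm(w_true))
```

The contrast is only meant to make the complexity claim in the summary concrete: the classical step scales cubically in the number of basis functions, whereas a surrogate-based block update needs only matrix-vector products per iteration.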
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2020.3049056