A New Discriminative Sparse Representation Method for Robust Face Recognition via l2 Regularization
Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 28, No. 10, pp. 2233 - 2242 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2017 |
Summary | Sparse representation has shown attractive performance in a number of applications. However, available sparse representation methods still suffer from several problems, and more efficient methods need to be designed. In particular, designing a computationally inexpensive, easily solvable, and robust sparse representation method is a significant task. In this paper, we explore the design of simple, robust, and highly efficient sparse representation methods for image classification. The contributions of this paper are as follows. First, a novel discriminative sparse representation method is proposed, and its noticeable performance in image classification is demonstrated by the experimental results; moreover, the proposed method outperforms existing state-of-the-art sparse representation methods. Second, the proposed method is not only computationally efficient but also rests on an intuitive and easily understandable idea: it exploits a simple algorithm to obtain a closed-form solution and a discriminative representation of the test sample. Third, the feasibility, computational efficiency, and remarkable classification accuracy of the proposed l2 regularization-based representation are comprehensively shown by extensive experiments and analysis. The code of the proposed method is available at http://www.yongxu.org/lunwen.html. |
Bibliography | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
ISSN | 2162-237X, 2162-2388 |
DOI | 10.1109/TNNLS.2016.2580572 |
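
The closed-form solution mentioned in the summary is worth unpacking for readers skimming the record. In the generic l2-regularized (collaborative) representation framework, a test sample y is coded over the training dictionary X by minimizing ||y - Xa||^2 + lambda*||a||^2, which has the closed-form solution a = (X^T X + lambda*I)^(-1) X^T y; the sample is then assigned to the class whose training columns best reconstruct it. The sketch below illustrates only that generic scheme, not the paper's exact discriminative formulation (which adds a discriminative term not reproduced here); the function name and the parameter `lam` are illustrative assumptions.

```python
# Minimal sketch of generic l2-regularized representation-based
# classification (collaborative-representation style). This is NOT the
# paper's discriminative method; it only illustrates the closed-form
# ridge coding step that the abstract alludes to.
import numpy as np

def l2_representation_classify(X, labels, y, lam=0.01):
    """Classify test sample y (d,) over dictionary X (d, n) with labels (n,)."""
    n = X.shape[1]
    # Closed-form solution: a single linear solve, no iterative optimization.
    a = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
    best_class, best_residual = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        # Reconstruct y using only the coefficients belonging to class c.
        residual = np.linalg.norm(y - X[:, mask] @ a[mask])
        if residual < best_residual:
            best_class, best_residual = c, residual
    return best_class

# Toy usage: 3 classes, 5 training samples each, 64-dimensional features.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 15))
X /= np.linalg.norm(X, axis=0)            # unit-norm columns, as is customary
labels = np.repeat(np.arange(3), 5)
y = X[:, 7] + 0.05 * rng.normal(size=64)  # noisy copy of a class-1 sample
print(l2_representation_classify(X, labels, y))  # expected: 1
```

The single linear solve is what makes l2-regularized representation markedly cheaper than iterative l1 solvers, which is the computational advantage the summary emphasizes.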