Group representation-based classification
| Published in | 2015 12th International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), pp. 768 - 772 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.08.2015 |
Summary: Conventional representation-based classification algorithms follow the classical representation principle: they first compute a representation of a test sample as a linear combination of the training samples, and then classify the test sample by the deviation between each class's reconstruction and the test sample. However, this deviation does not always accurately reflect the distance between the subject and the class, especially when the database contains a very large number of samples. This paper proposes a novel representation-based classification method named group representation-based classification (GRC). The method divides the face database into several groups to narrow the gap between the number of samples and the image dimensionality, which improves classification accuracy. Step one divides the training samples into several groups. The test sample is then coded with collaborative representation classification (CRC) [16] within each group. Finally, a fusion factor combines the results of the groups to yield the final decision. The paper introduces the essential bases and elements of this face recognition technique. Experiments show that the proposed method achieves higher accuracy, outperforming collaborative representation classification and some naive linear regression classifiers.
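The group-then-fuse pipeline described in the summary can be sketched as follows. This is a minimal illustration, not the paper's implementation: the regularization parameter `lam`, the per-class residual formula (the standard CRC regularized residual), and the equal fusion weights are all assumptions, since the paper's exact fusion factor is not given in the abstract.

```python
import numpy as np

def crc_residuals(X, labels, y, lam=0.01):
    """Collaborative representation within one group: code y over all
    training columns of X, then measure per-class reconstruction residuals."""
    # Regularized least-squares code: alpha = (X^T X + lam*I)^{-1} X^T y
    n = X.shape[1]
    alpha = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        recon = X[:, mask] @ alpha[mask]          # reconstruction by class c only
        # CRC-style regularized residual: ||y - X_c a_c|| / ||a_c||
        residuals[c] = np.linalg.norm(y - recon) / (np.linalg.norm(alpha[mask]) + 1e-12)
    return residuals

def grc_classify(groups, y, lam=0.01, weights=None):
    """GRC sketch: run CRC in each group of training samples, then fuse the
    per-class residuals across groups (equal fusion weights assumed)."""
    if weights is None:
        weights = [1.0 / len(groups)] * len(groups)
    fused = {}
    for w, (X, labels) in zip(weights, groups):
        for c, r in crc_residuals(X, labels, y, lam).items():
            fused[c] = fused.get(c, 0.0) + w * r
    # Predicted class = smallest fused residual
    return min(fused, key=fused.get)
```

Each group holds fewer samples than the whole database, so the per-group least-squares system is better conditioned; the fusion step then aggregates the groups' evidence into one decision.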
DOI: 10.1109/FSKD.2015.7382039