Optimal linear combination of neural networks for improving classification performance

Bibliographic Details
Published in IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 22, no. 2, pp. 207–215
Main Author Ueda, N.
Format Journal Article
Language English
Published New York IEEE 01.02.2000
Institute of Electrical and Electronics Engineers (IEEE)
ISSN0162-8828
1939-3539
DOI10.1109/34.825759


More Information
Summary: This paper presents a new method for linearly combining multiple neural network classifiers, grounded in statistical pattern recognition theory. In this approach, several neural networks are first selected according to which performs best for each class in terms of minimizing classification errors. They are then linearly combined to form a classifier that exploits the strengths of the individual networks. The minimum classification error (MCE) criterion is used to estimate the optimal linear weights; because the classification decision rule is incorporated directly into the cost function, the resulting weights are better suited to the classification objective. Experimental results on artificial and real data sets show that the proposed method constructs a combined classifier that outperforms the best single classifier in terms of overall classification error on test data.
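The summary describes combining classifier outputs with linear weights chosen to minimize classification error rather than, say, squared error. The following is a minimal illustrative sketch (not the paper's algorithm): it fabricates posterior-like outputs for two classifiers on synthetic labels, forms the combined discriminant g(x) = w·f1(x) + (1−w)·f2(x), and picks w by a simple 1-D search over the empirical error rate, standing in for the gradient-based MCE estimation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic setup: N samples, C classes, two classifiers.
N, C = 300, 3
y = rng.integers(0, C, size=N)

def noisy_posteriors(y, noise):
    """Posterior-like outputs: one-hot targets corrupted by noise, renormalized."""
    p = np.eye(C)[y] + noise * rng.random((len(y), C))
    return p / p.sum(axis=1, keepdims=True)

f1 = noisy_posteriors(y, 1.5)   # weaker classifier
f2 = noisy_posteriors(y, 0.8)   # stronger classifier

def error_rate(scores, y):
    """Fraction of samples whose arg-max score disagrees with the label."""
    return np.mean(scores.argmax(axis=1) != y)

# Combined discriminant g(x) = w * f1(x) + (1 - w) * f2(x).
# Search the convex weight directly against classification error —
# a crude stand-in for minimizing an MCE cost function by gradient descent.
ws = np.linspace(0.0, 1.0, 101)
errs = [error_rate(w * f1 + (1 - w) * f2, y) for w in ws]
w_best = ws[int(np.argmin(errs))]

print("classifier 1 error:", error_rate(f1, y))
print("classifier 2 error:", error_rate(f2, y))
print("combined error (w = %.2f):" % w_best, min(errs))
```

Because w = 0 and w = 1 (the individual classifiers) are included in the search grid, the combined error can never exceed the better single classifier on the data used for fitting; the paper's contribution is showing that an MCE-trained combination also generalizes better on held-out test data.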