Classifier fusion in the Dempster–Shafer framework using optimized t-norm based combination rules

Bibliographic Details
Published in: International Journal of Approximate Reasoning, Vol. 52, No. 3, pp. 353–374
Main Authors: Quost, Benjamin; Masson, Marie-Hélène; Denœux, Thierry
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.03.2011
Summary: When combining classifiers in the Dempster–Shafer framework, Dempster’s rule is generally used. However, this rule assumes the classifiers to be independent. This paper investigates the use of other operators for combining non-independent classifiers, including the cautious rule and, more generally, t-norm based rules whose behavior ranges between Dempster’s rule and the cautious rule. Two strategies are investigated for learning an optimal combination scheme based on a parameterized family of t-norms. The first learns a single rule by minimizing an error criterion. The second is a two-step procedure: groups of classifiers with similar outputs are first identified using a clustering algorithm; within- and between-cluster rules are then determined by minimizing an error criterion. Experiments with various synthetic and real data sets demonstrate the effectiveness of both the single-rule and two-step strategies. Overall, optimizing a single t-norm based rule yields better results than using a fixed rule, including Dempster’s rule, and the two-step strategy brings further improvements.
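The operators discussed in the summary can be illustrated with a short sketch. The following Python snippet is not the paper's implementation; it shows Dempster's rule on a small frame of discernment, plus Frank's parameterized t-norm family, which recovers the product as s → 1 and the minimum as s → 0 (the two limits corresponding, roughly, to Dempster-like and cautious-like conjunctive behavior). In the paper the t-norms are applied to the weight functions of the canonical decomposition rather than directly to mass values, so this is only a sketch of the underlying building blocks.

```python
import math

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination of two mass functions
    followed by normalization. Masses are dicts mapping frozenset focal
    elements to their belief mass."""
    combined = {}
    conflict = 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb  # mass assigned to the empty set
    k = 1.0 - conflict
    return {a: v / k for a, v in combined.items()}

def frank_tnorm(x, y, s):
    """Frank t-norm T_s(x, y): tends to x*y as s -> 1 and to min(x, y)
    as s -> 0."""
    if abs(s - 1.0) < 1e-12:
        return x * y
    return math.log1p((s**x - 1.0) * (s**y - 1.0) / (s - 1.0)) / math.log(s)

# Toy example on the frame {'a', 'b'} (hypothetical classifier outputs).
A, B, AB = frozenset('a'), frozenset('b'), frozenset('ab')
m1 = {A: 0.6, AB: 0.4}
m2 = {A: 0.5, B: 0.3, AB: 0.2}
m = dempster(m1, m2)
# Conflict is m1(A)*m2(B) = 0.18, so after normalization:
# m[A] ~ 0.756, m[B] ~ 0.146, m[AB] ~ 0.098
```

Varying the parameter s of the t-norm then interpolates between product-like and minimum-like pointwise combination, which is the kind of tunable behavior the paper's optimized rules exploit.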
ISSN: 0888-613X, 1873-4731
DOI: 10.1016/j.ijar.2010.11.008