Abstract
Naïve Bayes (NB) is a probabilistic classification model built on the attribute conditional independence assumption. In many real-world applications, however, this assumption is violated. In response, superparent-one-dependence estimators (SPODEs) weaken the independence assumption by using each attribute of the database in turn as a superparent on which the other attributes may depend. Aggregating one-dependence estimators (AODE), which estimates the corresponding parameters for every SPODE and combines their predictions, has proved to be one of the most effective improvements to the NB classifier owing to its high accuracy. This paper investigates a novel approach that ensembles single SPODEs through a boosting strategy: Boosting for superparent-one-dependence estimators (BODE). BODE first assigns a weight to every instance and then, in each iteration, selects the SPODE with the highest weighted accuracy as a weak classifier. The selected weak classifiers are finally combined to classify test instances. Experiments on UCI datasets demonstrate the performance of the algorithm.
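The abstract outlines the training loop but not its exact update rule. Below is a minimal sketch of that loop, assuming an AdaBoost.M1-style reweighting scheme, discrete attributes integer-encoded as 0..n_values-1, and Laplace smoothing; all names (`SPODE`, `bode_fit`, `bode_predict`) are illustrative, not the authors' code.

```python
import numpy as np

class SPODE:
    """One-dependence estimator with attribute `sp` as the superparent:
    P(c, x) is modelled as P(c, x_sp) * prod_i P(x_i | c, x_sp)."""
    def __init__(self, sp, n_values, n_classes, alpha=1.0):
        self.sp, self.n_values, self.n_classes, self.alpha = sp, n_values, n_classes, alpha

    def fit(self, X, y, w):
        n, d = X.shape
        V, C, sp = self.n_values, self.n_classes, self.sp
        self.joint = np.full((C, V), self.alpha)          # weighted counts for P(c, x_sp)
        self.cond = np.full((d, C, V, V), self.alpha)     # weighted counts for P(x_i | c, x_sp)
        for xi, yi, wi in zip(X, y, w):
            self.joint[yi, xi[sp]] += wi
            for i in range(d):
                self.cond[i, yi, xi[sp], xi[i]] += wi
        self.joint /= self.joint.sum()
        self.cond /= self.cond.sum(axis=3, keepdims=True)
        return self

    def predict(self, X):
        preds = []
        for xi in X:
            logp = np.log(self.joint[:, xi[self.sp]])
            for i in range(X.shape[1]):
                if i != self.sp:
                    logp += np.log(self.cond[i, :, xi[self.sp], xi[i]])
            preds.append(int(np.argmax(logp)))
        return np.array(preds)

def bode_fit(X, y, n_values, n_classes, n_rounds=10):
    """Boost SPODEs: each round trains one SPODE per attribute on the
    weighted data and keeps the one with the lowest weighted error."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                # endow every instance a weight
    ensemble = []
    for _ in range(n_rounds):
        best = min((SPODE(sp, n_values, n_classes).fit(X, y, w) for sp in range(d)),
                   key=lambda m: np.sum(w * (m.predict(X) != y)))
        err = np.sum(w * (best.predict(X) != y))
        if err >= 0.5 or err == 0.0:       # AdaBoost.M1 stopping condition (assumed)
            break
        beta = err / (1.0 - err)
        w *= np.where(best.predict(X) != y, 1.0, beta)  # down-weight correct instances
        w /= w.sum()
        ensemble.append((np.log(1.0 / beta), best))     # classifier weight
    return ensemble

def bode_predict(ensemble, X, n_classes):
    votes = np.zeros((len(X), n_classes))
    for alpha, clf in ensemble:
        votes[np.arange(len(X)), clf.predict(X)] += alpha
    return votes.argmax(axis=1)
```

With data encoded this way, `bode_fit` returns a list of (weight, SPODE) pairs that `bode_predict` combines by weighted vote at test time, mirroring the boosted combination described above.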
Original language | English |
---|---|
Pages (from-to) | 277-286 |
Number of pages | 10 |
Journal | International Journal of Computing Science and Mathematics |
Volume | 4 |
Issue number | 3 |
DOIs | |
Publication status | Published - 2013 |
Externally published | Yes |