Abstract
Most improvements to Naive Bayes (NB) share a common but important flaw: they split the modeling of the classifier into two separate stages, a preprocessing stage (e.g., feature selection and data expansion) and a stage that builds the NB classifier. Because the first stage does not take the NB objective function into consideration, classification performance cannot be guaranteed. Motivated by this observation, and aiming to improve the classification accuracy of NB, we present a new learning algorithm called Evolutionary Local Instance Weighted Naive Bayes (ELWNB) that extends NB for classification. ELWNB seamlessly combines local NB, instance-weighted dataset extension, and evolutionary algorithms. Experiments on 20 UCI benchmark datasets demonstrate that ELWNB significantly outperforms NB and several other improved NB algorithms.
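The core idea in the abstract, tuning instance weights against the classifier's own objective rather than in a separate preprocessing stage, can be illustrated with a short sketch. The following is a minimal, hypothetical example and not the authors' ELWNB: it pairs a weighted Gaussian Naive Bayes with a simple (1+1) evolutionary strategy that mutates the instance-weight vector and keeps mutations that improve held-out accuracy. All function names and hyperparameters here are illustrative assumptions.

```python
# Minimal sketch of instance-weighted Gaussian Naive Bayes with weights tuned
# by a (1+1) evolutionary strategy. Illustrative only; NOT the ELWNB algorithm.
import numpy as np

def fit_weighted_gnb(X, y, w):
    """Estimate class priors and per-feature Gaussians from weighted instances."""
    params = {}
    for c in np.unique(y):
        wc, Xc = w[y == c], X[y == c]
        total = wc.sum()
        prior = total / w.sum()                                   # weighted class prior
        mean = (wc[:, None] * Xc).sum(axis=0) / total             # weighted mean
        var = (wc[:, None] * (Xc - mean) ** 2).sum(axis=0) / total + 1e-9
        params[c] = (prior, mean, var)
    return params

def predict(params, X):
    """Pick the class with the highest log-posterior under the fitted model."""
    classes = sorted(params)
    scores = np.column_stack([
        np.log(params[c][0])
        - 0.5 * np.sum(np.log(2 * np.pi * params[c][2])
                       + (X - params[c][1]) ** 2 / params[c][2], axis=1)
        for c in classes
    ])
    return np.array(classes)[scores.argmax(axis=1)]

def evolve_weights(X_tr, y_tr, X_val, y_val, gens=200, sigma=0.1, seed=0):
    """(1+1)-ES: mutate the instance-weight vector; keep a mutation whenever
    validation accuracy (a proxy for the classifier's objective) improves."""
    rng = np.random.default_rng(seed)
    w = np.ones(len(X_tr))
    best = np.mean(predict(fit_weighted_gnb(X_tr, y_tr, w), X_val) == y_val)
    for _ in range(gens):
        cand = np.clip(w + rng.normal(0, sigma, size=w.shape), 1e-3, None)
        acc = np.mean(predict(fit_weighted_gnb(X_tr, y_tr, cand), X_val) == y_val)
        if acc >= best:
            w, best = cand, acc
    return w, best

if __name__ == "__main__":
    # Tiny synthetic demo: two Gaussian blobs, shuffled train/validation split.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
    y = np.repeat([0, 1], 50)
    idx = rng.permutation(100)
    X, y = X[idx], y[idx]
    w, acc = evolve_weights(X[:70], y[:70], X[70:], y[70:])
    print(f"validation accuracy: {acc:.2f}")
```

Because each candidate weight vector is scored by the accuracy of the classifier it induces, weight learning and classifier construction share a single objective, which is exactly the coupling the abstract argues is missing from two-stage NB improvements.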
Original language | English |
---|---|
Title of host publication | IJCNN 2016 |
Subtitle of host publication | Proceedings of the 2016 International Joint Conference on Neural Networks |
Place of Publication | Piscataway, NJ |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Pages | 3124-3129 |
Number of pages | 6 |
ISBN (Electronic) | 9781509006205, 9781509006199 |
ISBN (Print) | 9781509006212 |
DOIs | |
Publication status | Published - 31 Oct 2016 |
Externally published | Yes |
Event | 2016 International Joint Conference on Neural Networks, IJCNN 2016, Vancouver, Canada. Duration: 24 Jul 2016 → 29 Jul 2016 |
Conference
Conference | 2016 International Joint Conference on Neural Networks, IJCNN 2016 |
---|---|
Country/Territory | Canada |
City | Vancouver |
Period | 24/07/16 → 29/07/16 |