Evolutionary lazy learning for Naive Bayes classification

Yu Bai, Haishuai Wang, Jia Wu, Yun Zhang, Jing Jiang, Guodong Long

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

7 Citations (Scopus)

Abstract

Most improvements to Naive Bayes (NB) share a common yet important flaw: they split the modeling of the classifier into two separate stages, a preprocessing stage (e.g., feature selection and data expansion) and a stage that builds the NB classifier. Because the first stage does not take NB's objective function into account, classification performance cannot be guaranteed. Motivated by these observations, and aiming to improve the classification accuracy of NB, we present a new learning algorithm, Evolutionary Local Instance Weighted Naive Bayes (ELWNB), which extends NB for classification. ELWNB seamlessly combines local NB, instance-weighted dataset extension, and evolutionary algorithms. Experiments on 20 UCI benchmark datasets demonstrate that ELWNB significantly outperforms NB and several other improved NB algorithms.
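To make the general idea concrete, the sketch below shows one way an evolutionary search over per-instance weights can be driven directly by NB's classification performance, rather than by a separate preprocessing objective. This is a minimal illustration of the approach described in the abstract, not the authors' ELWNB implementation: the function names (`fitness`, `evolve_instance_weights`), the (1+λ)-style search loop, and the use of scikit-learn's `GaussianNB` (which accepts per-sample weights in `fit`) are all assumptions made for the example.

```python
# Minimal sketch (not the authors' ELWNB code): evolve per-instance weights
# for a Naive Bayes classifier so the search optimizes classification
# performance directly, instead of a decoupled preprocessing criterion.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

def fitness(weights, X_tr, y_tr, X_val, y_val):
    """Validation accuracy of an NB model trained with the given instance weights."""
    model = GaussianNB()
    model.fit(X_tr, y_tr, sample_weight=weights)
    return model.score(X_val, y_val)

def evolve_instance_weights(X, y, pop_size=20, generations=50,
                            mutation_scale=0.1, seed=None):
    """(1+lambda)-style evolutionary search over instance weights (illustrative)."""
    rng = np.random.default_rng(seed)
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)
    best = np.ones(len(X_tr))            # uniform weights = plain NB baseline
    best_fit = fitness(best, X_tr, y_tr, X_val, y_val)
    for _ in range(generations):
        # Mutate the current best weight vector to form candidate offspring.
        pop = best + rng.normal(0.0, mutation_scale, size=(pop_size, len(best)))
        pop = np.clip(pop, 1e-3, None)   # instance weights must stay positive
        for cand in pop:
            f = fitness(cand, X_tr, y_tr, X_val, y_val)
            if f > best_fit:
                best, best_fit = cand, f
    # Refit on the weighted training data with the best weights found.
    model = GaussianNB().fit(X_tr, y_tr, sample_weight=best)
    return model, best, best_fit
```

Under these assumptions, the fitness of each candidate weight vector is simply the held-out accuracy of the NB classifier it induces, so the evolutionary search and the classifier share one objective, which is the coupling the abstract argues that two-stage methods lack.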

Original language: English
Title of host publication: IJCNN 2016
Subtitle of host publication: Proceedings of the 2016 International Joint Conference on Neural Networks
Place of publication: Piscataway, NJ
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 3124-3129
Number of pages: 6
ISBN (Electronic): 9781509006205, 9781509006199
ISBN (Print): 9781509006212
DOIs
Publication status: Published - 31 Oct 2016
Externally published: Yes
Event: 2016 International Joint Conference on Neural Networks, IJCNN 2016 - Vancouver, Canada
Duration: 24 Jul 2016 – 29 Jul 2016

Conference

Conference: 2016 International Joint Conference on Neural Networks, IJCNN 2016
Country/Territory: Canada
City: Vancouver
Period: 24/07/16 – 29/07/16
