Attribute weighting: how and when does it work for Bayesian Network Classification

Jia Wu, Zhihua Cai, Shirui Pan, Xingquan Zhu, Chengqi Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution

5 Citations (Scopus)

Abstract

A Bayesian Network (BN) is a graphical model that represents conditional dependencies between random variables, such as diseases and symptoms. A Bayesian Network Classifier (BNC) uses a BN to characterize the relationships between attributes and class labels; a simplified approach assumes that the attributes are conditionally independent of one another given the class label, i.e., the Naive Bayes (NB) classification model. One major way to mitigate NB's primary weakness (the conditional independence assumption) is attribute weighting, which has proven effective for NB with its simple structure. However, for weighted BNCs with complex structures, in which attribute weighting is embedded into the model, no existing study examines whether weighting works for complex BNCs or how strongly it affects learning on a given task. In this paper, we first survey several complex structure models for BNCs, and then carry out experimental studies to investigate the effectiveness of attribute weighting strategies for complex BNCs, with a focus on Hidden Naive Bayes (HNB) and Averaged One-Dependence Estimation (AODE). Our studies use classification accuracy (ACC), area under the ROC curve ranking (AUC), and conditional log likelihood (CLL) as the performance metrics. Experiments and comparisons on 36 benchmark data sets demonstrate that attribute weighting only slightly outperforms unweighted complex BNCs with respect to ACC and AUC, but significant improvement can be observed under CLL.
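For context, the attribute weighting idea and the CLL metric mentioned in the abstract are commonly written as below. This is a generic sketch rather than the exact formulation used in the paper; the symbols (attribute weights w_i, attributes x_i, class c, classifier B, data set D) are introduced here only for illustration.

% Attribute-weighted Naive Bayes decision rule (a common formulation;
% setting w_i = 1 for every attribute recovers standard NB):
\[
  \hat{c}(\mathbf{x}) = \arg\max_{c} \; P(c) \prod_{i=1}^{n} P(x_i \mid c)^{w_i}
\]
% Conditional log likelihood (CLL) of a classifier B on a labeled data set D = {(x_j, c_j)}:
\[
  \mathrm{CLL}(B \mid D) = \sum_{j=1}^{|D|} \log P_{B}(c_j \mid \mathbf{x}_j)
\]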
Original language: English
Title of host publication: 2014 International Joint Conference on Neural Networks (IJCNN)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 4076-4083
Number of pages: 8
ISBN (Electronic): 9781479914845
DOIs: https://doi.org/10.1109/IJCNN.2014.6889536
Publication status: Published - 1 Jul 2014
Externally published: Yes
Event: 2014 International Joint Conference on Neural Networks, IJCNN 2014 - Beijing, China
Duration: 6 Jul 2014 - 11 Jul 2014

Conference

Conference: 2014 International Joint Conference on Neural Networks, IJCNN 2014
Country: China
City: Beijing
Period: 6/07/14 - 11/07/14

Keywords

  • Bayes methods
  • estimation theory
  • pattern classification
  • CLL
  • conditional log likelihood
  • ROC curve ranking
  • AODE
  • averaged one-dependence estimation
  • HNB
  • hidden naive Bayes
  • naive Bayes classification
  • BNC
  • graphical model
  • Bayesian network classification
  • attribute weighting
  • Training
  • Mutual information
  • Estimation
  • Educational institutions
  • Annealing


Cite this

    Wu, J., Cai, Z., Pan, S., Zhu, X., & Zhang, C. (2014). Attribute weighting: how and when does it work for Bayesian Network Classification. In 2014 International Joint Conference on Neural Networks (IJCNN) (pp. 4076-4083). Institute of Electrical and Electronics Engineers (IEEE). https://doi.org/10.1109/IJCNN.2014.6889536