Hybrid dynamic k-nearest-neighbour and distance and attribute weighted method for classification

Jia Wu, Zhihua Cai, Shuang Ao

Research output: Contribution to journal › Article › peer-review

42 Citations (Scopus)


K-nearest-neighbour (KNN) is an important classification method that has been widely used in data mining. However, the class probability estimation, the neighbourhood size, and the type of distance function used by KNN can all affect its classification accuracy. Many researchers have focused on improving the accuracy of KNN via distance weighting, attribute weighting, and dynamic neighbourhood selection. In this paper, we first review improved KNN algorithms in the three categories mentioned above. We then single out an improved algorithm called dynamic KNN with distance and attribute weighting, DKNDAW for short. We tested the new algorithm experimentally in the Weka system, comparing it to KNN, WAKNN, KNNDW, KNNDAW, and DKNN. The experimental results show that DKNDAW significantly outperforms the other algorithms in terms of classification accuracy. Besides, we investigate how to learn a DKNDAW model from data that produces accurate rankings, or more precisely, which attribute weighting method enables DKNDAW to produce accurate rankings. We explore several methods: the gain ratio method, the correlation-based feature selection method, and the decision tree-based method. We conclude that the gain ratio method is the most suitable for our improved KNN algorithm DKNDAW.
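The hybrid idea the abstract describes combines two ingredients: an attribute-weighted distance (each attribute scaled by a learned weight, e.g. its gain ratio) and distance-weighted voting among the k nearest neighbours. The sketch below is a minimal illustration of that combination, not the authors' implementation; the function name and the fixed vote weight 1/(d + eps) are assumptions for illustration, and in DKNDAW the neighbourhood size would additionally be chosen dynamically rather than fixed.

```python
import math
from collections import defaultdict

def knn_daw_predict(train_X, train_y, x, k=3, attr_weights=None):
    """Classify x with distance- and attribute-weighted k-NN (illustrative sketch).

    attr_weights: one weight per attribute (e.g. gain ratios learned from the
    training data); defaults to uniform weights, which reduces to plain KNNDW.
    """
    w = attr_weights or [1.0] * len(x)
    # Attribute-weighted Euclidean distance to every training instance.
    dists = []
    for xi, yi in zip(train_X, train_y):
        d = math.sqrt(sum(wj * (a - b) ** 2 for wj, a, b in zip(w, xi, x)))
        dists.append((d, yi))
    dists.sort(key=lambda t: t[0])
    # Each of the k nearest neighbours votes with weight inversely
    # proportional to its distance (closer neighbours count more).
    votes = defaultdict(float)
    for d, yi in dists[:k]:
        votes[yi] += 1.0 / (d + 1e-9)
    return max(votes, key=votes.get)
```

With uniform `attr_weights` this behaves like distance-weighted KNN; supplying gain-ratio weights shrinks the influence of uninformative attributes on the distance, which is the effect the gain ratio method is used for here.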
Original language: English
Pages (from-to): 378-384
Number of pages: 7
Journal: International Journal of Computer Applications in Technology
Issue number: 4
Publication status: Published - 1 Jun 2012
Externally published: Yes


  • dynamic KNN
  • k–nearest–neighbour
  • distance weighted
  • attribute weighted
  • decision tree
  • classification accuracy
  • gain ratio
  • feature selection


