Siamese capsule networks with global and local features for text classification

Yujia Wu, Jing Li*, Jia Wu, Jun Chang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

Text classification is a popular research topic in natural language processing with a wide range of applications. Existing text classification methods based on deep neural networks can effectively extract the local features of text, and the classification models built on these methods achieve good experimental results. However, these methods generally ignore both the global semantic information of the different text categories and the global spatial distances between categories, which to some extent adversely affects classification accuracy. To address this problem, this study proposed Siamese capsule networks with global and local features. A Siamese network was used to learn the global semantic differences between categories, allowing the semantic distance between different categories to be represented more accurately. A global memory mechanism was established to store global semantic features, which were then incorporated into the text classification model. Capsule vectors were used to capture the spatial position relationships among local features, thereby improving the representational capability of the features. Experimental results on six public datasets showed that the proposed model performed significantly better than ten baseline algorithms.
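The two core ingredients named in the abstract can be illustrated in a few lines: a capsule "squash" non-linearity, which maps a capsule vector's length into [0, 1) while preserving its direction, and a Siamese distance, i.e. a metric applied to the outputs of two weight-sharing encoder branches. The sketch below is illustrative only and is not the authors' implementation; the embeddings are hypothetical stand-ins for encoder outputs.

```python
import numpy as np

def squash(v, eps=1e-8):
    # Capsule squash non-linearity: output length is ||v||^2 / (1 + ||v||^2),
    # so short vectors shrink toward 0 and long vectors approach length 1,
    # while the direction (the encoded spatial relationship) is preserved.
    norm_sq = float(np.sum(v ** 2))
    return (norm_sq / (1.0 + norm_sq)) * v / (np.sqrt(norm_sq) + eps)

def siamese_distance(emb_a, emb_b):
    # Euclidean distance between the two branches' outputs. Because a
    # Siamese network shares weights across branches, this distance can be
    # trained to reflect the semantic distance between text categories.
    return float(np.linalg.norm(emb_a - emb_b))

# Hypothetical capsule vectors for two texts (not real model outputs).
x = squash(np.array([3.0, 4.0]))   # long input -> length 25/26, near 1
y = squash(np.array([0.3, 0.4]))   # short input -> length 0.2, near 0
d = siamese_distance(x, y)
```

A contrastive loss would typically be applied to such distances during training, pulling same-category pairs together and pushing different-category pairs apart.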

Original language: English
Pages (from-to): 88-98
Number of pages: 11
Journal: Neurocomputing
Volume: 390
DOIs
Publication status: Published - 21 May 2020

Keywords

  • Capsule networks
  • Global and local features
  • Neural networks
  • Siamese networks
  • Text classification

