Combination of loss functions for deep text classification

Hamideh Hajiabadi, Diego Molla-Aliod, Reza Monsefi, Hadi Sadoghi Yazdi

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)
125 Downloads (Pure)

Abstract

Ensemble methods have been shown to improve the results of statistical classifiers by combining multiple single learners into a strong one. In this paper, we explore the use of ensemble methods at the level of the objective function of a deep neural network. We propose a novel objective function that is a linear combination of single losses and integrate it into a deep neural network, so that the weights of the linear combination are learned by backpropagation during training. We study the impact of this ensemble loss function on state-of-the-art convolutional neural networks for text classification and demonstrate its effectiveness through comprehensive experiments, which show a significant improvement over conventional state-of-the-art methods in the literature.
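The core idea of the abstract (a loss that is a linear combination of single losses, with the combination weights trained by backpropagation together with the network) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the choice of constituent losses (cross-entropy, multi-margin, squared error) and the softmax normalization of the mixing weights are assumptions made here for concreteness.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EnsembleLoss(nn.Module):
    """Learnable linear combination of several single losses.

    The mixing weights are trainable parameters updated by backpropagation
    alongside the classifier's parameters; a softmax keeps them positive and
    summing to one (this normalization scheme is an assumption, not taken
    from the paper).
    """

    def __init__(self, num_classes: int):
        super().__init__()
        self.num_classes = num_classes
        # One unconstrained parameter per constituent loss.
        self.raw_weights = nn.Parameter(torch.zeros(3))

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        one_hot = F.one_hot(targets, self.num_classes).float()
        single_losses = torch.stack([
            F.cross_entropy(logits, targets),                      # log loss
            F.multi_margin_loss(logits, targets),                  # hinge-style loss
            F.mse_loss(torch.softmax(logits, dim=-1), one_hot),    # squared error
        ])
        weights = torch.softmax(self.raw_weights, dim=0)
        return (weights * single_losses).sum()

# Usage with any classifier that outputs class logits, e.g. a text CNN
# (model, batch_inputs, batch_labels are placeholders):
# criterion = EnsembleLoss(num_classes=4)
# optimizer = torch.optim.Adam(list(model.parameters()) + list(criterion.parameters()))
# loss = criterion(model(batch_inputs), batch_labels)
# loss.backward(); optimizer.step()

Because the mixing weights are registered as parameters of the loss module, they receive gradients through the same backward pass as the classifier, which matches the abstract's description of learning the combination weights during training.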
Original language: English
Pages (from-to): 751-761
Number of pages: 11
Journal: International Journal of Machine Learning and Cybernetics
Volume: 11
Issue number: 4
Early online date: 19 Jul 2019
DOIs
Publication status: Published - Apr 2020

Keywords

  • Loss function
  • Convolutional neural network
  • Ensemble method
  • Multi-class classification
  • Multi-class classifier
