Combination of loss functions for deep text classification

Hamideh Hajiabadi, Diego Molla-Aliod, Reza Monsefi, Hadi Sadoghi Yazdi

Research output: Contribution to journal › Article › Research › peer-review

Abstract

Ensemble methods have been shown to improve the results of statistical classifiers by combining multiple single learners into a strong one. In this paper, we explore the use of ensemble methods at the level of the objective function of a deep neural network. We propose a novel objective function that is a linear combination of single losses, and we integrate the proposed objective function into a deep neural network. By doing so, the weights associated with the linear combination of losses are learned by backpropagation during the training stage. We study the impact of such an ensemble loss function on state-of-the-art convolutional neural networks for text classification. We show the effectiveness of our approach through comprehensive experiments on text classification. The experimental results demonstrate a significant improvement compared with conventional state-of-the-art methods in the literature.
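The abstract describes a loss that is a linear combination of single losses, with the mixing weights learned by gradient descent during training. A minimal, hypothetical sketch of that mechanics follows; it is not the authors' code, and the choice of single losses (hinge and logistic), the softmax parameterisation of the weights, and the toy binary-margin setup are all assumptions for illustration:

```python
import numpy as np

# Sketch: combined loss L(alpha) = sum_k w_k * loss_k, with
# w = softmax(alpha), so the mixing weights stay positive,
# sum to 1, and can be updated by gradient descent alongside
# the network parameters.

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Two single losses on a scalar margin m = y * score (toy binary case).
def hinge(m):
    return max(0.0, 1.0 - m)

def logistic(m):
    return np.log1p(np.exp(-m))

def combined_loss(alpha, margins):
    w = softmax(alpha)
    losses = np.array([
        np.mean([hinge(m) for m in margins]),
        np.mean([logistic(m) for m in margins]),
    ])
    return float(w @ losses)

# Update the mixing weights by (numerical) gradient descent.
margins = np.array([0.5, -0.2, 1.3, 0.8])
alpha = np.zeros(2)
lr, eps = 0.5, 1e-6
for _ in range(100):
    grad = np.zeros_like(alpha)
    for k in range(2):
        d = np.zeros_like(alpha); d[k] = eps
        grad[k] = (combined_loss(alpha + d, margins)
                   - combined_loss(alpha - d, margins)) / (2 * eps)
    alpha -= lr * grad

weights = softmax(alpha)
print(weights)  # mixing weights, always summing to 1
```

Note that in the paper the weights are learned jointly with the network by backpropagation; optimizing alpha alone, as in this toy loop, simply shifts weight toward whichever single loss is currently smaller.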
Language: English
Number of pages: 11
Journal: International Journal of Machine Learning and Cybernetics
DOIs: 10.1007/s13042-019-00982-x
Publication status: E-pub ahead of print - 19 Jul 2019

Keywords

  • Loss function
  • Convolutional neural network
  • Ensemble method
  • Multi-class classification
  • Multi-class classifier

Cite this

@article{0e1ee6e58f83400f89aa028f5a8a5d31,
title = "Combination of loss functions for deep text classification",
abstract = "Ensemble methods have been shown to improve the results of statistical classifiers by combining multiple single learners into a strong one. In this paper, we explore the use of ensemble methods at the level of the objective function of a deep neural network. We propose a novel objective function that is a linear combination of single losses, and we integrate the proposed objective function into a deep neural network. By doing so, the weights associated with the linear combination of losses are learned by backpropagation during the training stage. We study the impact of such an ensemble loss function on state-of-the-art convolutional neural networks for text classification. We show the effectiveness of our approach through comprehensive experiments on text classification. The experimental results demonstrate a significant improvement compared with conventional state-of-the-art methods in the literature.",
keywords = "Loss function, Convolutional neural network, Ensemble method, Multi-class classification, Multi-class classifier",
author = "Hajiabadi, Hamideh and Molla-Aliod, Diego and Monsefi, Reza and Yazdi, {Hadi Sadoghi}",
year = "2019",
month = "7",
day = "19",
doi = "10.1007/s13042-019-00982-x",
language = "English",
journal = "International Journal of Machine Learning and Cybernetics",
issn = "1868-8071",
publisher = "Springer Nature",

}

Combination of loss functions for deep text classification. / Hajiabadi, Hamideh; Molla-Aliod, Diego; Monsefi, Reza; Yazdi, Hadi Sadoghi.

In: International Journal of Machine Learning and Cybernetics, 19.07.2019.

Research output: Contribution to journal › Article › Research › peer-review

TY - JOUR

T1 - Combination of loss functions for deep text classification

AU - Hajiabadi, Hamideh

AU - Molla-Aliod, Diego

AU - Monsefi, Reza

AU - Yazdi, Hadi Sadoghi

PY - 2019/7/19

Y1 - 2019/7/19

N2 - Ensemble methods have been shown to improve the results of statistical classifiers by combining multiple single learners into a strong one. In this paper, we explore the use of ensemble methods at the level of the objective function of a deep neural network. We propose a novel objective function that is a linear combination of single losses, and we integrate the proposed objective function into a deep neural network. By doing so, the weights associated with the linear combination of losses are learned by backpropagation during the training stage. We study the impact of such an ensemble loss function on state-of-the-art convolutional neural networks for text classification. We show the effectiveness of our approach through comprehensive experiments on text classification. The experimental results demonstrate a significant improvement compared with conventional state-of-the-art methods in the literature.

AB - Ensemble methods have been shown to improve the results of statistical classifiers by combining multiple single learners into a strong one. In this paper, we explore the use of ensemble methods at the level of the objective function of a deep neural network. We propose a novel objective function that is a linear combination of single losses, and we integrate the proposed objective function into a deep neural network. By doing so, the weights associated with the linear combination of losses are learned by backpropagation during the training stage. We study the impact of such an ensemble loss function on state-of-the-art convolutional neural networks for text classification. We show the effectiveness of our approach through comprehensive experiments on text classification. The experimental results demonstrate a significant improvement compared with conventional state-of-the-art methods in the literature.

KW - Loss function

KW - Convolutional neural network

KW - Ensemble method

KW - Multi-class classification

KW - Multi-class classifier

UR - http://www.scopus.com/inward/record.url?scp=85069510475&partnerID=8YFLogxK

U2 - 10.1007/s13042-019-00982-x

DO - 10.1007/s13042-019-00982-x

M3 - Article

JO - International Journal of Machine Learning and Cybernetics

T2 - International Journal of Machine Learning and Cybernetics

JF - International Journal of Machine Learning and Cybernetics

SN - 1868-8071

ER -