Abstract
Ensemble methods have been shown to improve the results of statistical classifiers by combining multiple weak learners into a strong one. In this paper, we explore the use of ensemble methods at the level of the objective function of a deep neural network. We propose a novel objective function that is a linear combination of single losses, and we integrate it into a deep neural network so that the weights of the linear combination are learned by backpropagation during training. We study the impact of this ensemble loss function on state-of-the-art convolutional neural networks for text classification, and we show the effectiveness of our approach through comprehensive experiments. The experimental results demonstrate a significant improvement over conventional state-of-the-art methods in the literature.
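The core idea — a linear combination of single losses whose mixing weights are themselves trained by gradient descent — can be illustrated with a minimal NumPy sketch. The softmax normalisation of the weights and the specific losses are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def ensemble_loss(losses, alphas):
    """Linear combination of single losses with learnable weights.

    The raw parameters `alphas` are softmax-normalised so the mixing
    weights stay positive and sum to 1 (a common choice; the paper's
    parameterisation may differ -- this is an assumption).
    """
    w = np.exp(alphas) / np.exp(alphas).sum()
    return float(np.dot(w, losses)), w

# Toy values for two single losses (e.g. cross-entropy and hinge).
losses = np.array([0.9, 0.4])
alphas = np.zeros(2)  # learnable mixing parameters

total, w = ensemble_loss(losses, alphas)

# Gradient of the combined loss w.r.t. alphas (softmax chain rule),
# followed by one gradient-descent step -- the same update that
# backpropagation would apply during training.
grad = w * (losses - total)
alphas -= 0.1 * grad

new_total, new_w = ensemble_loss(losses, alphas)
```

In a real network the two scalar losses would be computed from the model's outputs each batch, and `alphas` would be registered as trainable parameters so the optimiser updates them alongside the network weights.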
| Original language | English |
|---|---|
| Pages (from-to) | 751–761 |
| Number of pages | 11 |
| Journal | International Journal of Machine Learning and Cybernetics |
| Volume | 11 |
| Issue number | 4 |
| Early online date | 19 Jul 2019 |
| DOIs | |
| Publication status | Published - Apr 2020 |
Keywords
- Loss function
- Convolutional neural network
- Ensemble method
- Multi-class classification
- Multi-class classifier