TY - JOUR
T1 - Gradient boosted neural decision forest
AU - Dong, Manqing
AU - Yao, Lina
AU - Wang, Xianzhi
AU - Benatallah, Boualem
AU - Zhang, Shuai
AU - Sheng, Quan Z.
PY - 2023
AB - Tree-based models and deep neural networks are two schools of effective classification methods in machine learning. While tree-based models are robust across data domains, deep neural networks have advantages in handling high-dimensional data. Adding a differentiable neural decision forest to a neural network can generally help exploit the benefits of both models. Meanwhile, traditional decision trees have diverged into a bagging version (i.e., random forest) and a boosting version (i.e., gradient boosting decision tree). In this work, we aim to harness the advantages of both bagging and boosting by applying gradient boosting to a neural decision forest. We propose a gradient boosting module that learns the residual using a neural decision forest, treating the residual as part of the final prediction. In addition, we design a structure for learning the parameters of the neural decision forest and the gradient boosting module in consecutive steps, which is extendable to incorporate multiple gradient boosting modules in an end-to-end manner. Our extensive experiments on several public datasets demonstrate the competitive performance and efficiency of our model against a series of baseline methods on various machine learning tasks.
UR - http://www.scopus.com/inward/record.url?scp=85121337745&partnerID=8YFLogxK
DO - 10.1109/TSC.2021.3133673
M3 - Article
AN - SCOPUS:85121337745
SN - 1939-1374
VL - 16
SP - 330
EP - 342
JF - IEEE Transactions on Services Computing
IS - 1
ER -