TY - GEN
T1 - NLGT
T2 - 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025
AU - Xu, Xiaolong
AU - Zhou, Yibo
AU - Xiang, Haolong
AU - Li, Xiaoyong
AU - Zhang, Xuyun
AU - Qi, Lianyong
AU - Dou, Wanchun
PY - 2025/4/11
Y1 - 2025/4/11
N2 - Graph Neural Networks (GNNs) are widely applied to graph learning tasks, such as node classification, link prediction, and graph generation. Existing GNNs mostly adopt a message-passing mechanism to aggregate node information from neighbors, which often makes node representations similar after rounds of aggregation and leads to oversmoothing. Although recent works have made improvements by combining different message aggregation methods or introducing semantic encodings as priors, these message-passing-based GNNs still fail to combat oversmoothing after multiple iterations of node aggregation. Besides, the feature extraction ability of these methods is restricted by graph sparsity, which hinders the aggregation of node information. To deal with the above two issues, we propose the Neighborhood-based and Label-enhanced Graph Transformer (NLGT), a novel and effective framework for graph learning. Specifically, we present a label-enhanced feature fusion mechanism that integrates shallow node features and label embeddings as enhanced features. Moreover, we design a neighborhood-based mask attention mechanism to alleviate the negative effects caused by the sparsity of the graph. In the prediction stage, we aggregate the prediction results from multiple sampled sub-graphs and apply voting mechanisms to enhance the accuracy and robustness of our framework. Finally, extensive experiments are conducted on four open benchmark datasets, which demonstrate the effectiveness and robustness of our proposed framework compared with existing state-of-the-art methods.
AB - Graph Neural Networks (GNNs) are widely applied to graph learning tasks, such as node classification, link prediction, and graph generation. Existing GNNs mostly adopt a message-passing mechanism to aggregate node information from neighbors, which often makes node representations similar after rounds of aggregation and leads to oversmoothing. Although recent works have made improvements by combining different message aggregation methods or introducing semantic encodings as priors, these message-passing-based GNNs still fail to combat oversmoothing after multiple iterations of node aggregation. Besides, the feature extraction ability of these methods is restricted by graph sparsity, which hinders the aggregation of node information. To deal with the above two issues, we propose the Neighborhood-based and Label-enhanced Graph Transformer (NLGT), a novel and effective framework for graph learning. Specifically, we present a label-enhanced feature fusion mechanism that integrates shallow node features and label embeddings as enhanced features. Moreover, we design a neighborhood-based mask attention mechanism to alleviate the negative effects caused by the sparsity of the graph. In the prediction stage, we aggregate the prediction results from multiple sampled sub-graphs and apply voting mechanisms to enhance the accuracy and robustness of our framework. Finally, extensive experiments are conducted on four open benchmark datasets, which demonstrate the effectiveness and robustness of our proposed framework compared with existing state-of-the-art methods.
UR - http://www.scopus.com/inward/record.url?scp=105003909910&partnerID=8YFLogxK
U2 - 10.1609/aaai.v39i12.33413
DO - 10.1609/aaai.v39i12.33413
M3 - Conference proceeding contribution
AN - SCOPUS:105003909910
T3 - Proceedings of the AAAI Conference on Artificial Intelligence
SP - 12954
EP - 12962
BT - AAAI-25: Proceedings of the 39th Annual AAAI Conference on Artificial Intelligence
A2 - Walsh, Toby
A2 - Shah, Julie
A2 - Kolter, Zico
PB - Association for the Advancement of Artificial Intelligence
CY - Washington, DC
Y2 - 25 February 2025 through 4 March 2025
ER -