Abstract
Graph-based algorithms have drawn much attention thanks to their impressive success in semi-supervised setups. To improve model performance, previous studies have learned to transform the topology of the input graph. However, these works focus only on optimizing the original nodes and edges, leaving the direction of augmenting the existing data insufficiently explored. In this paper, we propose a novel heuristic pre-processing technique, namely Local Label Consistency Strengthening (LLCS), which automatically expands new nodes and edges to refine the label consistency within a dense subgraph. Our framework can effectively benefit downstream models by substantially enlarging the original training set with high-quality generated labeled data and by refining the original graph topology. To justify the generality and practicality of LLCS, we couple it with the popular graph convolutional network and graph attention network to perform extensive evaluations on three standard datasets. In all setups tested, our method boosts the average accuracy by a large margin of 4.7% and consistently outperforms the state of the art.
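The paper itself is not reproduced here, but the core idea the abstract describes, generating new labeled data by strengthening label consistency in dense neighborhoods, can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the function `augment_labels`, its agreement threshold `min_agree`, and the simple "all labeled neighbours agree" rule are hypothetical stand-ins, not the actual LLCS algorithm.

```python
# Hypothetical sketch of label-consistency-based augmentation.
# Assumption: an unlabeled node whose labeled neighbours unanimously
# agree on one class (with at least `min_agree` of them) receives that
# class as a pseudo-label, enlarging the training set.

def augment_labels(adj, labels, min_agree=2):
    """adj: dict mapping node -> list of neighbour nodes.
    labels: dict mapping labeled node -> class.
    Returns a copy of `labels` extended with pseudo-labels."""
    new_labels = dict(labels)
    for node, neighbours in adj.items():
        if node in labels:
            continue  # already part of the training set
        neighbour_labels = [labels[n] for n in neighbours if n in labels]
        # unanimous agreement among sufficiently many labeled neighbours
        if len(neighbour_labels) >= min_agree and len(set(neighbour_labels)) == 1:
            new_labels[node] = neighbour_labels[0]
    return new_labels

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4], 4: [3]}
labels = {1: "a", 2: "a"}
print(augment_labels(adj, labels))  # node 0 gains pseudo-label "a"
```

The augmented label set could then be fed to any downstream model (e.g. a GCN or GAT) exactly as the abstract suggests; the pre-processing step is model-agnostic.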
Original language | English |
---|---|
Title of host publication | CIKM 2021 - Proceedings of the 30th ACM International Conference on Information and Knowledge Management |
Place of Publication | New York, NY |
Publisher | Association for Computing Machinery |
Pages | 3201-3205 |
Number of pages | 5 |
ISBN (Electronic) | 9781450384469 |
Publication status | Published - 2021 |
Event | 30th ACM International Conference on Information and Knowledge Management, CIKM 2021 - Virtual, Online, Australia (1 Nov 2021 → 5 Nov 2021) |
Conference
Conference | 30th ACM International Conference on Information and Knowledge Management, CIKM 2021 |
---|---|
Country/Territory | Australia |
City | Virtual, Online |
Period | 1/11/21 → 5/11/21 |
Keywords
- node classification
- semi-supervised learning
- topology enhanced transformation