Abstract
Graph meta-learning has recently attracted considerable attention owing to its potential to extract common, transferable knowledge from learning different tasks on a graph. Existing graph meta-learning methods usually leverage local subgraphs to transfer subgraph-specific information. However, they inherently face the challenge of imbalanced subgraphs caused by inconsistent node density and differing label distributions across local subgraphs. This paper proposes an adaptive graph meta-learning framework (AG-Meta) for learning consistent and transferable graph representations in a way that adapts to imbalanced subgraphs. Specifically, AG-Meta first learns structural representations of subgraphs with varying degrees using an Adaptive Graph Cascade Diffusion Network (AGCDN). AG-Meta then employs a prototype-consistency classifier to produce more accurate transferable inductive representations (also called prototypes) under few-shot settings with differing label distributions within a subgraph. Finally, when optimizing a model-agnostic meta-learner, a novel metric loss is introduced to enforce structural-representation and prototype consistency. Extensive experiments comparing AG-Meta against baselines on five real-world networks validate that AG-Meta outperforms state-of-the-art approaches.
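To make the prototype-based few-shot idea concrete, the sketch below shows a generic prototype classifier with an added consistency-style term. It is not the authors' AG-Meta implementation: the function names, the squared-Euclidean prototype classifier, and the MSE consistency term are illustrative assumptions standing in for the paper's prototype-consistency classifier and metric loss.

```python
# Minimal, illustrative sketch (NOT the paper's code): class prototypes from a
# few-shot support set, distance-based classification of queries, and an optional
# consistency term pulling prototypes from two subgraph "views" together.
import torch
import torch.nn.functional as F


def compute_prototypes(support_emb, support_labels, n_classes):
    # One prototype per class: the mean embedding of that class's support nodes.
    return torch.stack(
        [support_emb[support_labels == c].mean(dim=0) for c in range(n_classes)]
    )


def prototype_loss(query_emb, query_labels, protos, aux_protos=None, consistency_weight=0.1):
    # Classify queries by negative squared Euclidean distance to each prototype.
    dists = torch.cdist(query_emb, protos) ** 2
    loss = F.cross_entropy(-dists, query_labels)
    # Hypothetical consistency term: align prototypes computed from two views
    # (e.g., different subgraph samples) as a proxy for prototype consistency.
    if aux_protos is not None:
        loss = loss + consistency_weight * F.mse_loss(protos, aux_protos)
    return loss


if __name__ == "__main__":
    torch.manual_seed(0)
    n_classes, dim = 3, 16
    support_emb = torch.randn(n_classes * 5, dim)          # 5-shot support set
    support_labels = torch.arange(n_classes).repeat_interleave(5)
    query_emb = torch.randn(n_classes * 10, dim)           # 10 queries per class
    query_labels = torch.arange(n_classes).repeat_interleave(10)
    protos = compute_prototypes(support_emb, support_labels, n_classes)
    print(prototype_loss(query_emb, query_labels, protos).item())
```

In a meta-learning setup such as the one the abstract describes, a loss of this kind would be computed per task and used to update a shared (model-agnostic) initialization; the exact form of AG-Meta's metric loss is given in the paper itself.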
| Original language | English |
|---|---|
| Article number | 110387 |
| Pages (from-to) | 1-12 |
| Number of pages | 12 |
| Journal | Pattern Recognition |
| Volume | 151 |
| DOIs | |
| Publication status | Published - Jul 2024 |
Keywords
- Embedding representation
- Local subgraphs
- Few-shot graph learning
- Meta-learning