Multimodal hyperbolic graph learning for Alzheimer’s disease detection

Chengyao Xie, Wenhao Zhou, Ciyuan Peng, Azadeh Noori Hoshyar, Chengpei Xu, Usman Naseem, Feng Xia

Research output: Working paper (Preprint)

Abstract

Multimodal graph learning techniques have demonstrated significant potential in modeling brain networks for Alzheimer’s disease (AD) detection. However, most existing methods rely on Euclidean space representations and overlook the scale-free and small-world properties of brain networks, which are characterized by power-law distributions and dense local clustering of nodes. This oversight results in distortions when representing these complex structures. To address this issue, we propose a novel multimodal Poincaré Fréchet mean graph convolutional network (MochaGCN) for AD detection. MochaGCN leverages the exponential growth characteristics of hyperbolic space to capture the scale-free and small-world properties of multimodal brain networks. Specifically, we combine hyperbolic graph convolution and the Poincaré Fréchet mean to extract features from multimodal brain networks, enhancing their representations in hyperbolic space. Our approach constructs multimodal brain networks by integrating information from diffusion tensor imaging (DTI) and functional magnetic resonance imaging (fMRI) data. Experiments on the Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset demonstrate that the proposed method outperforms state-of-the-art techniques.
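The Fréchet mean named in the abstract is the point minimizing the sum of squared geodesic distances to a set of points; in the Poincaré ball it has no closed form and is typically found iteratively. The sketch below (not the authors' implementation; a generic Karcher-flow baseline assuming curvature -1 and the standard Möbius-gyrovector operations) shows one common way to compute it:

```python
import numpy as np

def mobius_add(x, y):
    # Möbius addition in the Poincaré ball (curvature -1)
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

def log_map(x, y):
    # Logarithmic map at x: tangent vector at x pointing toward y
    u = mobius_add(-x, y)
    norm_u = np.linalg.norm(u)
    if norm_u < 1e-12:
        return np.zeros_like(x)
    lam = 2.0 / (1 - np.dot(x, x))  # conformal factor at x
    return (2.0 / lam) * np.arctanh(norm_u) * u / norm_u

def exp_map(x, v):
    # Exponential map at x: follow the geodesic from x in direction v
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x.copy()
    lam = 2.0 / (1 - np.dot(x, x))
    return mobius_add(x, np.tanh(lam * norm_v / 2) * v / norm_v)

def frechet_mean(points, iters=100):
    # Karcher flow: repeatedly average the points in the tangent space
    # at the current estimate, then map the average back to the ball.
    m = points.mean(axis=0)  # Euclidean mean lies inside the (convex) ball
    for _ in range(iters):
        grad = np.mean([log_map(m, p) for p in points], axis=0)
        m = exp_map(m, grad)
    return m
```

For points placed symmetrically around the origin, the iteration returns the origin; for general point sets it converges to the unique minimizer because hyperbolic space has non-positive curvature.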
Original language: English
DOIs
Publication status: Submitted - 5 Nov 2024

Publication series

Name: medRxiv