Learning Structural-Functional Brain Representations through Multi-Scale Adaptive Graph Attention for Cognitive Insight
AI Summary
Proposes MAGNet, a multi-scale adaptive graph attention network that fuses structural and functional brain connectivity to improve prediction of cognitive function.
Key Contributions
- Proposes the MAGNet framework, which fuses structural and functional brain connectivity
- Learns structure-function interactions with a Transformer-style graph neural network
- Validates effectiveness on the ABCD dataset
Methodology
Constructs a hybrid graph that integrates direct and indirect pathways, refines connectivity importance with local-global attention, and optimizes end-to-end with a joint loss function.
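The three methodological ideas above can be illustrated with a toy sketch. This is not the authors' code: the 3-region matrices, the mixing weight `alpha`, and the trade-off weight `lam` are all made-up assumptions chosen only to show the mechanics of a hybrid direct/indirect graph, attention-rescaled edges, and a joint prediction-plus-coherence loss.

```python
# Toy sketch (NOT the MAGNet implementation): hybrid graph from direct
# (one-hop) and indirect (two-hop) paths, softmax attention over edge
# scores, and a joint loss = prediction error + cross-modal coherence.
import math

def matmul(A, B):
    # Plain-Python matrix product.
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def softmax(row):
    mx = max(row)
    ex = [math.exp(v - mx) for v in row]
    s = sum(ex)
    return [v / s for v in ex]

# Hypothetical structural (SC) and functional (FC) connectivity, 3 regions.
SC = [[0.0, 0.8, 0.1],
      [0.8, 0.0, 0.5],
      [0.1, 0.5, 0.0]]
FC = [[0.0, 0.6, 0.4],
      [0.6, 0.0, 0.2],
      [0.4, 0.2, 0.0]]
n = 3

# 1. Hybrid graph: direct edges plus two-hop (indirect) structural paths.
SC2 = matmul(SC, SC)
alpha = 0.5  # assumed mixing weight for indirect paths
H = [[SC[i][j] + alpha * SC2[i][j] for j in range(n)] for i in range(n)]

# 2. Attention: per-node softmax over combined hybrid + functional scores,
#    yielding refined edge-importance weights.
A = [softmax([H[i][j] + FC[i][j] for j in range(n)]) for i in range(n)]

# 3. Joint loss: prediction term plus a cross-modal coherence term that
#    penalizes disagreement between attention-weighted SC and FC edges.
pred, target = 0.7, 1.0  # hypothetical cognitive-score prediction
pred_loss = (pred - target) ** 2
coherence = sum((A[i][j] * SC[i][j] - FC[i][j]) ** 2
                for i in range(n) for j in range(n)) / (n * n)
lam = 0.1  # assumed coherence trade-off weight
loss = pred_loss + lam * coherence
print(round(loss, 4))
```

In the actual framework these steps are learned end-to-end on MRI-derived features rather than fixed toy matrices; the sketch only shows how the hybrid-graph, attention, and joint-loss pieces compose.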
Original Abstract
Understanding how brain structure and function interact is key to explaining intelligence, yet modeling them jointly is challenging because the structural and functional connectomes capture complementary aspects of brain organization. We introduce the Multi-scale Adaptive Graph Network (MAGNet), a Transformer-style graph neural network framework that adaptively learns structure-function interactions. MAGNet leverages source-based morphometry from structural MRI to extract inter-regional morphological features and fuses them with functional network connectivity from resting-state fMRI. A hybrid graph integrates direct and indirect pathways, local-global attention refines connectivity importance, and a joint loss simultaneously enforces cross-modal coherence and optimizes the prediction objective end-to-end. On the ABCD dataset, MAGNet outperformed relevant baselines, demonstrating effective multimodal integration for advancing our understanding of cognitive function.