LLM Reasoning relevance: 5/10

Recurrent Graph Neural Networks and Arithmetic Circuits

Timon Barlag, Vivian Holzapfel, Laura Strieker, Jonni Virtema, Heribert Vollmer
arXiv: 2603.05140v1 Published: 2026-03-05 Updated: 2026-03-05

AI Summary

The paper establishes an exact correspondence between the computational power of recurrent graph neural networks and recurrent arithmetic circuits.

Main Contributions

  • Introduces the model of recurrent arithmetic circuits
  • Proves that recurrent graph neural networks can simulate recurrent arithmetic circuits
  • Proves that recurrent arithmetic circuits can simulate recurrent graph neural networks
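As a hedged illustration of the circuit model named above: a recurrent arithmetic circuit can be pictured as a fixed circuit of addition and multiplication gates over the reals, plus memory gates whose values persist between iterations. The tiny hand-wired circuit below is an assumption for illustration only, not the paper's construction.

```python
# Illustrative sketch (not the paper's construction): a fixed circuit of
# + and * gates over the reals, with memory gates m1, m2 that store data
# between iterations of the recurrent circuit.

def recurrent_circuit(inputs, iterations):
    """Run a tiny hand-wired recurrent arithmetic circuit."""
    m1, m2 = 0.0, 1.0  # memory gates, initialised before the first iteration
    for _ in range(iterations):
        x, y = inputs
        g1 = x + m1       # addition gate reading input x and memory gate m1
        g2 = y * m2       # multiplication gate reading input y and memory gate m2
        m1, m2 = g1, g2   # memory gates store the results for the next iteration
    return m1, m2

print(recurrent_circuit((2.0, 3.0), 2))  # (4.0, 9.0)
```

The same circuit body is reused every iteration; only the memory-gate contents change, which is what makes the model "recurrent" in analogy to sequential logic circuits.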

Methodology

Through constructive proofs, the paper shows that each model can simulate the other, thereby establishing their equivalence.

Original Abstract

We characterise the computational power of recurrent graph neural networks (GNNs) in terms of arithmetic circuits over the real numbers. Our networks are not restricted to aggregate-combine GNNs or other particular types. Generalizing similar notions from the literature, we introduce the model of recurrent arithmetic circuits, which can be seen as arithmetic analogues of sequential or logical circuits. These circuits utilise so-called memory gates which are used to store data between iterations of the recurrent circuit. While (recurrent) GNNs work on labelled graphs, we construct arithmetic circuits that obtain encoded labelled graphs as real valued tuples and then compute the same function. For the other direction we construct recurrent GNNs which are able to simulate the computations of recurrent circuits. These GNNs are given the circuit-input as initial feature vectors and then, after the GNN-computation, have the circuit-output among the feature vectors of its nodes. In this way we establish an exact correspondence between the expressivity of recurrent GNNs and recurrent arithmetic circuits operating over real numbers.
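To make the GNN side of the correspondence concrete: a recurrent GNN applies one update rule repeatedly to the feature vectors of a labelled graph, with the input supplied as initial features and the output read off the node features afterwards. The sketch below uses a simple sum-aggregation update for illustration; the paper's networks are explicitly more general than such aggregate-combine GNNs.

```python
# Illustrative sketch of a recurrent GNN (a simple aggregate-combine
# instance, chosen for clarity; the paper covers more general networks).
# One update rule is applied for a number of iterations; the input is
# given as initial node features and the output is among the final features.

def recurrent_gnn(adj, features, iterations):
    """adj: adjacency list {node: [neighbours]}; features: {node: float}."""
    for _ in range(iterations):
        features = {
            # combine each node's own feature with the sum over its neighbours
            v: features[v] + sum(features[u] for u in adj[v])
            for v in adj
        }
    return features

# Triangle graph with all initial features set to 1.0
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(recurrent_gnn(adj, {0: 1.0, 1: 1.0, 2: 1.0}, 2))
```

Because the same parameters are reused at every iteration, such a network has the same "fixed circuit plus evolving state" shape as a recurrent circuit, which is the intuition behind the two simulation directions in the abstract.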

Tags

Graph Neural Networks · Computational Complexity · Arithmetic Circuits · GNN

arXiv Categories

cs.CC cs.AI cs.LG