LLM Reasoning relevance: 7/10

Transport and Merge: Cross-Architecture Merging for Large Language Models

Chenhang Cui, Binyun Yang, Fei Shen, Yuxin Chen, Jingnan Zheng, Xiang Wang, An Zhang, Tat-Seng Chua
arXiv: 2602.05495v1 Published: 2026-02-05 Updated: 2026-02-05

AI Summary

Proposes an optimal-transport-based framework for cross-architecture model merging, enabling effective knowledge transfer from large models to smaller ones.

Key Contributions

  • Proposes a cross-architecture model merging method based on optimal transport
  • Enables knowledge transfer from large models to heterogeneous smaller models
  • Validates the method's effectiveness on low-resource languages and specialized domains

Methodology

Optimal transport is used to align the activations of heterogeneous models and infer neuron-level correspondences, which then guide fusion in weight space.
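The pipeline described above (align activations with OT, then fuse in weight space) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the probe activations, weight shapes, Sinkhorn solver, barycentric projection, and mixing coefficient `alpha` are all assumptions for the sake of a runnable example.

```python
import numpy as np

def sinkhorn(C, reg=0.1, n_iters=200):
    """Entropy-regularized OT plan between uniform marginals (illustrative solver)."""
    n, m = C.shape
    a, b = np.ones(n) / n, np.ones(m) / m     # uniform mass per neuron
    K = np.exp(-C / reg)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                  # alternating marginal scaling
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]        # transport plan; entries sum to 1

rng = np.random.default_rng(0)
# Hypothetical activations on a small shared probe set (stand-ins for real model outputs).
acts_src = rng.normal(size=(64, 16))          # (n_inputs, n_neurons), large source model
acts_tgt = rng.normal(size=(64, 8))           # (n_inputs, n_neurons), small target model

# Cost matrix: distance between neuron activation profiles, scaled for numerical stability.
C = np.linalg.norm(acts_src.T[:, None, :] - acts_tgt.T[None, :, :], axis=2)
C /= C.max()

T = sinkhorn(C)                               # (16, 8) soft neuron correspondence

# Barycentric projection: express each target neuron as a mixture of source neurons,
# then map (hypothetical) per-neuron source weight rows into the target's neuron space.
W_src = rng.normal(size=(16, 32))
W_tgt = rng.normal(size=(8, 32))
T_norm = T / T.sum(axis=0, keepdims=True)     # columns sum to 1
W_src_aligned = T_norm.T @ W_src              # (8, 32) source weights in target space

alpha = 0.5                                   # fusion coefficient (assumed)
W_merged = (1 - alpha) * W_tgt + alpha * W_src_aligned
```

The transport plan plays the role of a soft permutation: instead of matching each target neuron to a single source neuron, each target row of the fused weights is a convex mixture of source rows weighted by the plan.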

Original Abstract

Large language models (LLMs) achieve strong capabilities by scaling model capacity and training data, yet many real-world deployments rely on smaller models trained or adapted from low-resource data. This gap motivates the need for mechanisms to transfer knowledge from large, high-resource models to smaller, low-resource targets. While model merging provides an effective transfer mechanism, most existing approaches assume architecture-compatible models and therefore cannot directly transfer knowledge from large high-resource LLMs to heterogeneous low-resource targets. In this work, we propose a cross-architecture merging framework based on optimal transport (OT) that aligns activations to infer cross-neuron correspondences between heterogeneous models. The resulting transport plans are then used to guide direct weight-space fusion, enabling effective high-resource to low-resource transfer using only a small set of inputs. Extensive experiments across low-resource languages and specialized domains demonstrate consistent improvements over target models.

Tags

Model Merging, Knowledge Transfer, Optimal Transport, Cross-Architecture, Large Language Models

arXiv Categories

cs.CL cs.AI