AI Agents relevance: 8/10

Diff-KD: Diffusion-based Knowledge Distillation for Collaborative Perception under Corruptions

Pengcheng Lyu, Chaokun Zhang, Gong Chen, Tao Tang, Zhaoxiang Luo
arXiv: 2604.02061v1 Published: 2026-04-02 Updated: 2026-04-02

AI Summary

Diff-KD combines diffusion models with knowledge distillation to improve the robustness of collaborative perception under data corruptions.

Key Contributions

  • Proposes the Diff-KD framework, which integrates diffusion models with knowledge distillation
  • Designs a Progressive Knowledge Distillation (PKD) module that uses a diffusion model to recover global semantics from corrupted observations
  • Proposes an Adaptive Gated Fusion (AGF) module that dynamically fuses neighbor information according to reliability (a minimal sketch follows this list)
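A minimal sketch of the reliability-gated fusion idea, assuming per-agent BEV feature maps and a learned ego-reliability estimator; the module layout, layer choices, and tensor shapes below are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of Adaptive Gated Fusion: gates neighbor features by how
# unreliable the ego feature looks at each spatial location.
import torch
import torch.nn as nn


class AdaptiveGatedFusion(nn.Module):
    """Fuse ego and neighbor BEV features with gates driven by ego reliability."""

    def __init__(self, channels: int):
        super().__init__()
        # Predicts a per-location reliability score for the ego feature map.
        self.reliability = nn.Sequential(
            nn.Conv2d(channels, channels // 2, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 2, 1, kernel_size=1),
            nn.Sigmoid(),
        )
        # Produces a gate for each neighbor, conditioned on ego + neighbor features.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, ego: torch.Tensor, neighbors: torch.Tensor) -> torch.Tensor:
        # ego:       (B, C, H, W)    feature map of the ego agent
        # neighbors: (B, N, C, H, W) warped feature maps from N collaborators
        rel = self.reliability(ego)                      # (B, 1, H, W)
        fused = ego.clone()
        for i in range(neighbors.size(1)):
            nb = neighbors[:, i]                         # (B, C, H, W)
            g = self.gate(torch.cat([ego, nb], dim=1))   # (B, 1, H, W)
            # Lean more on neighbor features where the ego feature looks unreliable.
            fused = fused + (1.0 - rel) * g * nb
        return fused
```

The design choice captured here is the paper's stated idea that fusion weights should depend on ego reliability: where the ego feature is trusted, neighbor contributions are damped, and where it is not, gated neighbor evidence fills in.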

Methodology

A diffusion model restores corrupted features, and this restoration is combined with knowledge distillation and adaptive fusion to improve robustness.
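To make the restoration step concrete, here is a minimal DDPM-style sketch of conditional feature denoising, assuming the corrupted BEV feature acts as the condition and the clean teacher feature as the target; the denoiser architecture, noise schedule, and function names are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch: treat feature restoration as conditional diffusion,
# regressing the noise injected into the clean (teacher) feature.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalDenoiser(nn.Module):
    """Predicts the noise added to a clean feature, given the corrupted feature."""

    def __init__(self, channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, noisy_clean: torch.Tensor, corrupted: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([noisy_clean, corrupted], dim=1))


def diffusion_restoration_loss(
    denoiser: ConditionalDenoiser,
    clean_teacher_feat: torch.Tensor,   # (B, C, H, W) features from the clean teacher
    corrupted_feat: torch.Tensor,       # (B, C, H, W) features from corrupted inputs
    alphas_cumprod: torch.Tensor,       # (T,) cumulative noise schedule
) -> torch.Tensor:
    """One DDPM-style training step: noise the teacher feature, condition on the
    corrupted feature, and regress the injected noise."""
    b = clean_teacher_feat.size(0)
    t = torch.randint(0, alphas_cumprod.numel(), (b,), device=clean_teacher_feat.device)
    a_bar = alphas_cumprod[t].view(b, 1, 1, 1)
    noise = torch.randn_like(clean_teacher_feat)
    noisy = a_bar.sqrt() * clean_teacher_feat + (1 - a_bar).sqrt() * noise
    pred_noise = denoiser(noisy, corrupted_feat)
    return F.mse_loss(pred_noise, noise)
```

At inference time the reverse process would start from noise (or from the corrupted feature itself) and iteratively denoise while conditioning on the corrupted observation, yielding a restored feature for downstream fusion.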

Original Abstract

Multi-agent collaborative perception enables autonomous systems to overcome individual sensing limits through collective intelligence. However, real-world sensor and communication corruptions severely undermine this advantage. Crucially, existing approaches treat corruptions as static perturbations or passively conform to corrupted inputs, failing to actively recover the underlying clean semantics. To address this limitation, we introduce Diff-KD, a framework that integrates diffusion-based generative refinement into teacher-student knowledge distillation for robust collaborative perception. Diff-KD features two core components: (i) Progressive Knowledge Distillation (PKD), which treats local feature restoration as a conditional diffusion process to recover global semantics from corrupted observations; and (ii) Adaptive Gated Fusion (AGF), which dynamically weights neighbors based on ego reliability during fusion. Evaluated on OPV2V and DAIR-V2X under seven corruption types, Diff-KD achieves state-of-the-art performance in both detection accuracy and calibration robustness.
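The abstract's teacher-student setup can be summarized as a composite objective; the sketch below is a hedged guess at its shape, assuming a frozen clean-input teacher, an MSE feature-imitation term, and a placeholder detection loss, with loss weights that are not values from the paper.

```python
# Hypothetical sketch of the distillation objective: task loss on the student
# plus a feature-imitation term pulling restored student features toward the
# clean teacher's features.
import torch
import torch.nn.functional as F


def distillation_objective(
    student_feat: torch.Tensor,    # restored student features, (B, C, H, W)
    teacher_feat: torch.Tensor,    # frozen teacher features on clean inputs, (B, C, H, W)
    detection_loss: torch.Tensor,  # task loss of the student detector
    kd_weight: float = 1.0,
) -> torch.Tensor:
    # Stop gradients through the teacher; only the student is updated.
    kd_loss = F.mse_loss(student_feat, teacher_feat.detach())
    return detection_loss + kd_weight * kd_loss
```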

Tags

Collaborative Perception · Knowledge Distillation · Diffusion Models · Robustness

arXiv Categories

cs.AI