Agent Tuning & Optimization (Relevance: 6/10)

Grow, Assess, Compress: Adaptive Backbone Scaling for Memory-Efficient Class Incremental Learning

Adrian Garcia-Castañeda, Jon Irureta, Jon Imaz, Aizea Lojo
arXiv: 2603.08426v1 · Published: 2026-03-09 · Updated: 2026-03-09

AI Summary

Proposes the GRACE framework, which adaptively adjusts model capacity to balance plasticity and stability, improving the efficiency of class incremental learning.

Key Contributions

  • Proposes the Grow, Assess, Compress (GRACE) framework
  • Introduces a saturation assessment phase that guides the decision to expand or compress the model
  • Significantly reduces memory footprint while maintaining or improving performance

Methodology

Through a cyclic sequence of growth, assessment, and compression steps, the model's architecture is adjusted dynamically, avoiding both parameter explosion and catastrophic forgetting.
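The grow/assess/compress cycle can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the class names, the saturation threshold, and the halving compression rule are all illustrative assumptions.

```python
# Hypothetical sketch of the GRACE cycle: grow a task-specific backbone,
# assess capacity saturation, and compress when capacity is underused.
# All names and numbers below are illustrative assumptions.
from dataclasses import dataclass, field

SATURATION_THRESHOLD = 0.8  # assumed: compress when utilization falls below this


@dataclass
class Backbone:
    """Toy stand-in for one task-specific backbone."""
    params: int  # parameter count added for this task


@dataclass
class GraceModel:
    backbones: list = field(default_factory=list)

    def grow(self, new_params: int) -> None:
        # Grow: add a task-specific backbone for the incoming task.
        self.backbones.append(Backbone(new_params))

    def assess(self, utilization: float) -> float:
        # Assess: a saturation score in [0, 1]; here supplied externally
        # (e.g. measured from activations on held-out probe data).
        return utilization

    def compress(self) -> None:
        # Compress: merge all backbones into one streamlined backbone,
        # keeping (in this toy) half of the accumulated parameters.
        total = sum(b.params for b in self.backbones)
        self.backbones = [Backbone(total // 2)]

    def total_params(self) -> int:
        return sum(b.params for b in self.backbones)


def incremental_step(model: GraceModel, new_params: int, utilization: float) -> None:
    """One CIL task: grow, then assess saturation, then maybe compress."""
    model.grow(new_params)
    if model.assess(utilization) < SATURATION_THRESHOLD:
        model.compress()  # capacity underused: fold backbones together


model = GraceModel()
incremental_step(model, new_params=100, utilization=0.9)  # saturated: keep growth
incremental_step(model, new_params=100, utilization=0.5)  # underused: compress
print(model.total_params())  # 100: the 200 accumulated params were halved
```

The key design point this illustrates is that compression is conditional on the assessment, so the architecture only stays expanded when the added capacity is actually being used.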

Original Abstract

Class Incremental Learning (CIL) poses a fundamental challenge: maintaining a balance between the plasticity required to learn new tasks and the stability needed to prevent catastrophic forgetting. While expansion-based methods effectively mitigate forgetting by adding task-specific parameters, they suffer from uncontrolled architectural growth and memory overhead. In this paper, we propose a novel dynamic scaling framework that adaptively manages model capacity through a cyclic "GRow, Assess, ComprEss" (GRACE) strategy. Crucially, we supplement backbone expansion with a novel saturation assessment phase that evaluates the utilization of the model's capacity. This assessment allows the framework to make informed decisions to either expand the architecture or compress the backbones into a streamlined representation, preventing parameter explosion. Experimental results demonstrate that our approach achieves state-of-the-art performance across multiple CIL benchmarks, while reducing memory footprint by up to 73% compared to purely expansionist models.
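One simple way to make the abstract's notion of "utilization of the model's capacity" concrete is to measure the fraction of units that respond non-trivially on a probe batch. The metric below is a hypothetical stand-in; the paper's actual saturation assessment may be defined differently.

```python
# Hypothetical saturation metric: the fraction of units whose mean absolute
# activation over a probe batch exceeds a small threshold. Purely illustrative
# of "capacity utilization" -- not the paper's actual assessment.

def saturation(activations: list[list[float]], eps: float = 0.1) -> float:
    """activations[i][j] is the activation of unit j on probe sample i."""
    n_units = len(activations[0])
    means = [sum(abs(row[j]) for row in activations) / len(activations)
             for j in range(n_units)]
    active = sum(m > eps for m in means)  # units that meaningfully respond
    return active / n_units


# Two of four units are essentially dead, so half the capacity is in use.
probe = [[0.9, 0.0, 0.5, 0.01],
         [1.1, 0.0, 0.7, 0.02]]
print(saturation(probe))  # 0.5
```

A score like this could then drive the expand-or-compress decision: near 1.0 suggests the current capacity is fully used and growth is warranted; a low score suggests the backbones can be compressed without losing learned behavior.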

Tags

Class Incremental Learning · Memory Efficiency · Dynamic Scaling

arXiv Categories

cs.LG cs.CV