Don't Look Back in Anger: MAGIC Net for Streaming Continual Learning with Temporal Dependence
AI Summary
MAGIC Net tackles concept drift, temporal dependence, and catastrophic forgetting in data streams by combining Continual Learning (CL) with recurrent neural networks.
Main Contributions
- Proposes MAGIC Net, a novel Streaming Continual Learning (SCL) method
- Combines CL architectural strategies with recurrent neural networks
- Addresses forgetting through learnable masks and architecture expansion
Methodology
MAGIC Net revisits past knowledge by applying learnable masks over frozen weights and expands its architecture when necessary, enabling continual learning while keeping inference available online.
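The mask-over-frozen-weights idea can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, the dense-layer structure, and the mask initialization are all illustrative assumptions; the only point carried over from the paper is that past weights stay frozen and adaptation happens through a per-weight mask.

```python
class MaskedLayer:
    """Illustrative sketch: a linear layer whose weights are frozen;
    adaptation to a new concept happens only through a per-weight mask."""

    def __init__(self, weights):
        # Past knowledge: copied once and never updated afterwards.
        self.frozen_weights = [row[:] for row in weights]
        # One learnable mask value per weight, initialized fully open (1.0).
        # In practice these would be trained; here they are set by hand.
        self.mask = [[1.0 for _ in row] for row in weights]

    def effective_weights(self):
        # Element-wise product: the mask gates which past knowledge is reused.
        return [
            [w * m for w, m in zip(w_row, m_row)]
            for w_row, m_row in zip(self.frozen_weights, self.mask)
        ]

    def forward(self, x):
        # Plain matrix-vector product using the masked weights.
        return [
            sum(w * xi for w, xi in zip(row, x))
            for row in self.effective_weights()
        ]


layer = MaskedLayer([[1.0, 2.0], [3.0, 4.0]])
print(layer.forward([1.0, 1.0]))  # all masks open → [3.0, 7.0]

# "Looking back" selectively: closing a mask entry suppresses one
# frozen weight without ever modifying it.
layer.mask[0][1] = 0.0
print(layer.forward([1.0, 1.0]))  # → [1.0, 7.0]
```

Because the underlying weights are never overwritten, reopening a mask recovers the old behavior exactly, which is how this family of approaches avoids catastrophic forgetting; architecture expansion (adding capacity when masks alone cannot fit a new concept) is the complementary mechanism and is not shown here.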
Original Abstract
Concept drift, temporal dependence, and catastrophic forgetting represent major challenges when learning from data streams. While Streaming Machine Learning and Continual Learning (CL) address these issues separately, recent efforts in Streaming Continual Learning (SCL) aim to unify them. In this work, we introduce MAGIC Net, a novel SCL approach that integrates CL-inspired architectural strategies with recurrent neural networks to tame temporal dependence. MAGIC Net continuously learns, looks back at past knowledge by applying learnable masks over frozen weights, and expands its architecture when necessary. It performs all operations online, ensuring inference availability at all times. Experiments on synthetic and real-world streams show that it improves adaptation to new concepts, limits memory usage, and mitigates forgetting.