Complementary Text-Guided Attention for Zero-Shot Adversarial Robustness
AI Summary
Proposes a complementary text-guided attention mechanism, Comp-TGA, to improve the robustness of the CLIP model in zero-shot adversarial settings.
Main Contributions
- Observes that adversarial perturbations cause shifts in text-guided attention
- Proposes the TGA-ZSR framework, which uses local and global attention constraints to enhance robustness
- Proposes the Comp-TGA method, which captures more comprehensive foreground features through a complementary attention mechanism
Methodology
By analyzing how adversarial examples affect the model, the method leverages text-guided attention to design a local attention refinement module and a global attention constraint module, and then combines them with complementary attention to further improve robustness.
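The complementary-attention idea above can be sketched in a few lines: fuse the foreground attention obtained from the class prompt with the *reversed* attention obtained from a non-class prompt (regions the non-class prompt ignores are likely foreground). This is a minimal illustration, not the paper's implementation; the equal-weight averaging and min-max renormalization are assumptions, and the function name is hypothetical.

```python
import numpy as np

def complementary_foreground_attention(class_attn, nonclass_attn):
    """Fuse two text-guided attention maps into one foreground map.

    class_attn:    attention map guided by the class prompt, values in [0, 1]
    nonclass_attn: attention map guided by a non-class prompt, values in [0, 1]

    The fusion rule (simple average) and the renormalization step are
    illustrative assumptions, not taken from the paper.
    """
    # Reversing the non-class attention highlights what it does NOT attend to,
    # which complements the class-prompt view of the foreground.
    reversed_attn = 1.0 - nonclass_attn
    fused = 0.5 * (class_attn + reversed_attn)
    # Rescale to [0, 1] so the map is comparable across images.
    fused = (fused - fused.min()) / (fused.max() - fused.min() + 1e-8)
    return fused
```

In practice the two input maps would come from a CLIP-style relevance or attention computation conditioned on the respective text prompts; here they are just arrays.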
Original Abstract
Due to their impressive zero-shot capabilities, pre-trained vision-language models (e.g., CLIP) have attracted widespread attention and adoption across various domains. Nonetheless, CLIP has been observed to be susceptible to adversarial examples. Through experimental analysis, we observe that adversarial perturbations induce shifts in text-guided attention. Building upon this observation, we propose a simple yet effective strategy: Text-Guided Attention for Zero-Shot Robustness (TGA-ZSR). This framework incorporates two components: a Local Attention Refinement Module and a Global Attention Constraint Module. Our goal is to maintain the generalization of the CLIP model while enhancing its adversarial robustness. The Global Attention Constraint Module acquires text-guided attention from both the target and original models using clean examples; its objective is to preserve performance on clean samples while improving overall robustness. However, we observe that this method occasionally focuses on irrelevant or spurious features, which can lead to suboptimal performance and undermine robustness in certain scenarios. To overcome this limitation, we further propose a novel approach called Complementary Text-Guided Attention (Comp-TGA), which integrates two types of foreground attention: attention guided by the class prompt and reversed attention driven by the non-class prompt. These complementary attention mechanisms allow the model to capture a more comprehensive and accurate representation of the foreground. Experiments validate that TGA-ZSR and Comp-TGA yield 9.58% and 11.95% improvements, respectively, in zero-shot robust accuracy over current state-of-the-art techniques across 16 datasets.
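The Global Attention Constraint described in the abstract keeps the fine-tuned (target) model's text-guided attention on clean images close to the frozen original CLIP model's attention. The abstract does not give the exact loss, so the L2 penalty below is an assumed form for illustration only, with a hypothetical function name.

```python
import numpy as np

def global_attention_constraint(target_attn, original_attn):
    """Assumed L2-style penalty between two text-guided attention maps.

    target_attn:   attention map from the model being fine-tuned (clean input)
    original_attn: attention map from the frozen original CLIP (clean input)

    Returns the mean squared difference; minimizing it discourages the
    fine-tuned model's attention from drifting on clean examples.
    """
    return float(np.mean((target_attn - original_attn) ** 2))
```

The penalty is zero when the two maps coincide and grows with any drift, so it can be added to the training objective as a regularizer weighted by a hyperparameter.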