Refine and Purify: Orthogonal Basis Optimization with Null-Space Denoising for Conditional Representation Learning
AI Summary
Proposes the OD-CRL framework, which optimizes the basis vectors used in conditional representation learning and suppresses interference, improving performance on customized tasks.
Main Contributions
- Proposes Adaptive Orthogonal Basis Optimization (AOBO)
- Proposes Null-Space Denoising Projection (NSDP)
- OD-CRL achieves state-of-the-art results on customized tasks
Methodology
AOBO constructs orthogonal semantic basis vectors, and NSDP projects embeddings onto the null space of the irrelevant subspaces, suppressing interference from non-target semantics.
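The two steps above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the exact curvature-based truncation rule and the form of the projection are assumptions (here, curvature is approximated by the second difference of the singular-value curve, and denoising uses the standard orthogonal-complement projector I - BᵀB).

```python
import numpy as np

def orthogonal_basis(text_embeds, min_rank=1):
    """AOBO sketch: orthonormal basis from an LLM-generated text basis
    via SVD with a curvature-based rank truncation (assumed criterion)."""
    U, S, Vt = np.linalg.svd(text_embeds, full_matrices=False)
    if len(S) > 2:
        # second difference of singular values as a curvature proxy;
        # keep components up to the sharpest bend in the spectrum
        curv = np.abs(np.diff(S, 2))
        k = max(min_rank, int(np.argmax(curv)) + 1)
    else:
        k = len(S)
    return Vt[:k]  # rows are orthonormal, span the conditional subspace

def null_space_denoise(x, irrelevant_basis):
    """NSDP sketch: project x onto the null space of the irrelevant
    subspace, i.e. x - B^T (B x) for orthonormal rows B."""
    B = irrelevant_basis
    return x - B.T @ (B @ x)

def conditional_representation(x, target_basis, irrelevant_basis):
    """Denoise first, then express x in the target subspace coordinates."""
    x_clean = null_space_denoise(x, irrelevant_basis)
    return target_basis @ x_clean
```

With orthonormal rows in `irrelevant_basis`, the denoising step zeroes exactly the components of `x` lying in the irrelevant subspace while leaving the orthogonal remainder untouched.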
Original Abstract
Conditional representation learning aims to extract criterion-specific features for customized tasks. Recent studies project universal features onto the conditional feature subspace spanned by an LLM-generated text basis to obtain conditional representations. However, such methods face two key limitations: sensitivity to the subspace basis and vulnerability to inter-subspace interference. To address these challenges, we propose OD-CRL, a novel framework integrating Adaptive Orthogonal Basis Optimization (AOBO) and Null-Space Denoising Projection (NSDP). Specifically, AOBO constructs orthogonal semantic bases via singular value decomposition with curvature-based truncation. NSDP suppresses non-target semantic interference by projecting embeddings onto the null space of irrelevant subspaces. Extensive experiments across customized clustering, customized classification, and customized retrieval tasks demonstrate that OD-CRL achieves new state-of-the-art performance with superior generalization.