LLM Reasoning relevance: 6/10

Refine and Purify: Orthogonal Basis Optimization with Null-Space Denoising for Conditional Representation Learning

Jiaquan Wang, Yan Lyu, Chen Li, Yuheng Jia
arXiv: 2602.05464v1 Published: 2026-02-05 Updated: 2026-02-05

AI Summary

Proposes the OD-CRL framework, which optimizes the basis vectors used in conditional representation learning and suppresses cross-subspace interference, improving performance on customized tasks.

Main Contributions

  • 提出Adaptive Orthogonal Basis Optimization (AOBO)
  • 提出Null-Space Denoising Projection (NSDP)
  • OD-CRL achieves state-of-the-art results on customized tasks

Methodology

AOBO is used to construct orthogonal semantic basis vectors, and NSDP projects embeddings onto the null space of irrelevant subspaces, suppressing non-target semantic interference.
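The AOBO step above could be sketched roughly as follows. The paper's exact curvature-based truncation rule is not given in this summary, so the discrete second difference of the singular-value curve is used here as an assumed elbow heuristic; the function name and shapes are illustrative, not the authors' implementation.

```python
import numpy as np

def adaptive_orthogonal_basis(text_features):
    """Sketch of AOBO: orthogonal semantic basis via SVD, with an
    assumed curvature-based (elbow) truncation of singular values.

    text_features: (m, d) array of LLM-generated text embeddings,
    one row per criterion description.
    Returns a (d, k) matrix with orthonormal columns.
    """
    # SVD of the feature matrix; the right singular vectors span the
    # semantic subspace in the d-dimensional embedding space
    _, s, Vt = np.linalg.svd(text_features, full_matrices=False)
    # curvature proxy: discrete second difference of singular values
    curvature = s[:-2] - 2.0 * s[1:-1] + s[2:]
    k = int(np.argmax(curvature)) + 1  # truncate at the sharpest bend
    return Vt[:k].T  # orthonormal basis, shape (d, k)
```

Because the returned columns come from an SVD, they are orthonormal by construction, which is what makes the later subspace projections well defined.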

Original Abstract

Conditional representation learning aims to extract criterion-specific features for customized tasks. Recent studies project universal features onto the conditional feature subspace spanned by an LLM-generated text basis to obtain conditional representations. However, such methods face two key limitations: sensitivity to subspace basis and vulnerability to inter-subspace interference. To address these challenges, we propose OD-CRL, a novel framework integrating Adaptive Orthogonal Basis Optimization (AOBO) and Null-Space Denoising Projection (NSDP). Specifically, AOBO constructs orthogonal semantic bases via singular value decomposition with a curvature-based truncation. NSDP suppresses non-target semantic interference by projecting embeddings onto the null space of irrelevant subspaces. Extensive experiments conducted across customized clustering, customized classification, and customized retrieval tasks demonstrate that OD-CRL achieves a new state-of-the-art performance with superior generalization.
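The null-space projection described in the abstract can be illustrated with a minimal sketch, assuming the standard projector I − BBᵀ onto the orthogonal complement of a subspace; the function name and argument shapes are assumptions, not the authors' code.

```python
import numpy as np

def null_space_denoise(embeddings, irrelevant_basis):
    """Sketch of NSDP: remove components lying in an irrelevant
    semantic subspace by projecting onto its null space.

    embeddings: (n, d) universal feature matrix.
    irrelevant_basis: (d, k) basis of the non-target subspace.
    """
    # orthonormalize the basis so that B @ B.T is a valid projector
    B, _ = np.linalg.qr(irrelevant_basis)
    # projector onto the orthogonal complement of span(B): I - B B^T
    P_null = np.eye(B.shape[0]) - B @ B.T
    return embeddings @ P_null  # the projector is symmetric
```

After this projection, each embedding is orthogonal to every direction in the irrelevant subspace, which is the sense in which non-target semantic interference is suppressed.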

Tags

Conditional representation learning · Orthogonal basis optimization · Null-space denoising

arXiv Categories

cs.AI cs.CV cs.LG