Multimodal Learning — relevance: 9/10

Hi-SAM: A Hierarchical Structure-Aware Multi-modal Framework for Large-Scale Recommendation

Pingjun Pan, Tingting Zhou, Peiyao Lu, Tingting Fei, Hongxiang Chen, Chuanjiang Luo
arXiv: 2602.11799v1 · Published: 2026-02-12 · Updated: 2026-02-12

AI Summary

Hi-SAM improves multi-modal recommendation by disentangling semantic tokens and adopting a hierarchical Transformer architecture, and validates its effectiveness in large-scale settings.

Key Contributions

  • Proposes a Disentangled Semantic Tokenizer (DST) to resolve semantic entanglement across modalities
  • Proposes a Hierarchical Memory-Anchor Transformer (HMAT) that accounts for the hierarchical structure of user interactions
  • Validates the method's effectiveness on real-world datasets and in a large-scale online environment

Methodology

Hi-SAM disentangles modality semantics through geometry-aware alignment and coarse-to-fine quantization, and encodes the hierarchical structure using Hierarchical RoPE and anchor tokens.
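The coarse-to-fine quantization idea can be illustrated with a minimal sketch. This is not the paper's implementation (which trains RQ-VAE-style codebooks with a mutual-information objective); it only shows the mechanism the summary describes: a shared codebook captures cross-modal consensus at the coarse level, and a modality-specific codebook quantizes the remaining residual. All names (`quantize`, `coarse_to_fine_tokenize`) and the random codebooks are illustrative assumptions.

```python
import numpy as np

def quantize(x, codebook):
    """Nearest-neighbor codebook lookup: return (code index, codeword)."""
    dists = np.linalg.norm(codebook - x, axis=1)
    idx = int(np.argmin(dists))
    return idx, codebook[idx]

def coarse_to_fine_tokenize(item_emb, shared_cb, specific_cb):
    """Coarse-to-fine residual quantization sketch (illustrative, not Hi-SAM's
    trained tokenizer): the shared codebook distills cross-modal consensus,
    then the modality-specific codebook quantizes the residual nuances."""
    coarse_id, coarse_vec = quantize(item_emb, shared_cb)
    residual = item_emb - coarse_vec
    fine_id, fine_vec = quantize(residual, specific_cb)
    recon = coarse_vec + fine_vec
    return (coarse_id, fine_id), recon

rng = np.random.default_rng(0)
shared_cb = rng.normal(size=(16, 8))    # shared (consensus) codebook
specific_cb = rng.normal(size=(16, 8))  # modality-specific codebook
item = rng.normal(size=8)               # fused item embedding (assumed given)
ids, recon = coarse_to_fine_tokenize(item, shared_cb, specific_cb)
print(ids, float(np.linalg.norm(item - recon)))
```

In the actual framework the two codebooks would be learned jointly, with mutual-information minimization pushing shared and modality-specific codes apart; the sketch uses fixed random codebooks purely to show the token layout (one coarse ID plus one fine ID per item).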

Original Abstract

Multi-modal recommendation has gained traction as items possess rich attributes like text and images. Semantic ID-based approaches effectively discretize this information into compact tokens. However, two challenges persist: (1) Suboptimal Tokenization: existing methods (e.g., RQ-VAE) lack disentanglement between shared cross-modal semantics and modality-specific details, causing redundancy or collapse; (2) Architecture-Data Mismatch: vanilla Transformers treat semantic IDs as flat streams, ignoring the hierarchy of user interactions, items, and tokens. Expanding items into multiple tokens amplifies length and noise, biasing attention toward local details over holistic semantics. We propose Hi-SAM, a Hierarchical Structure-Aware Multi-modal framework with two designs: (1) Disentangled Semantic Tokenizer (DST): unifies modalities via geometry-aware alignment and quantizes them via a coarse-to-fine strategy. Shared codebooks distill consensus while modality-specific ones recover nuances from residuals, enforced by mutual information minimization; (2) Hierarchical Memory-Anchor Transformer (HMAT): splits positional encoding into inter- and intra-item subspaces via Hierarchical RoPE to restore hierarchy. It inserts Anchor Tokens to condense items into compact memory, retaining details for the current item while accessing history only through compressed summaries. Experiments on real-world datasets show consistent improvements over SOTA baselines, especially in cold-start scenarios. Deployed on a large-scale social platform serving millions of users, Hi-SAM achieved a 6.55% gain in the core online metric.
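The anchor-token mechanism in the abstract — "retaining details for the current item while accessing history only through compressed summaries" — can be sketched as an attention mask. This is an assumption about how such a mask could look, not Hi-SAM's actual architecture; the layout (each item's semantic tokens followed by one anchor token) and the helper name `anchor_attention_mask` are hypothetical, and causal ordering within an item is ignored for simplicity.

```python
import numpy as np

def anchor_attention_mask(item_token_counts):
    """Boolean attention mask for an anchor-token sketch.
    Sequence layout: for each item, its semantic tokens followed by one
    anchor token summarizing the item. A query position may attend to:
      - every token of its own item (full local detail), and
      - only the anchor tokens of earlier items (compressed history)."""
    spans = []  # (start, end, anchor_index) per item
    pos = 0
    for n in item_token_counts:
        spans.append((pos, pos + n + 1, pos + n))  # +1 slot for the anchor
        pos += n + 1
    length = pos
    mask = np.zeros((length, length), dtype=bool)
    for i, (s, e, _) in enumerate(spans):
        mask[s:e, s:e] = True                      # full attention within item
        for (_, _, prev_anchor) in spans[:i]:
            mask[s:e, prev_anchor] = True          # history via anchors only
    return mask

mask = anchor_attention_mask([2, 3])  # two items with 2 and 3 semantic tokens
print(mask.astype(int))
```

Under this mask, attention cost over history grows with the number of items rather than the number of tokens, which matches the abstract's motivation of curbing the length and noise amplification caused by expanding each item into multiple tokens.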

Tags

Multimodal Learning · Recommender Systems · Transformer · Quantization · Hierarchical Structure

arXiv Category

cs.AI