RexBERT: Context Specialized Bidirectional Encoders for E-commerce
AI Summary
RexBERT targets the e-commerce domain, combining high-quality in-domain data with a principled training recipe to build efficient BERT-style encoders.
Key Contributions
- Releases Ecom-niverse, a large e-commerce domain corpus
- Proposes a reproducible pretraining recipe built on ModernBERT
- Trains RexBERT models and evaluates their performance on e-commerce tasks
Methodology
Pretraining proceeds in three phases: general pre-training, context extension, and annealed domain specialization, yielding BERT-style encoders specialized for the e-commerce domain.
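The three-phase schedule above can be sketched as a simple configuration. This is a minimal illustration, assuming a per-phase token budget, sequence length, and peak learning rate; the phase names come from the paper, but all numeric values here are hypothetical placeholders, not the paper's actual hyperparameters.

```python
from dataclasses import dataclass

@dataclass
class Phase:
    name: str
    seq_len: int     # max context length used during this phase
    tokens_b: float  # token budget for the phase, in billions (illustrative)
    peak_lr: float   # peak learning rate (illustrative)
    anneal: bool     # linearly decay the LR to zero within the phase

# Hypothetical three-phase recipe: short-context general pretraining,
# context extension to long sequences, then annealed domain specialization
# on in-domain (e-commerce) data.
SCHEDULE = [
    Phase("general_pretraining",   seq_len=1024, tokens_b=250.0, peak_lr=8e-4, anneal=False),
    Phase("context_extension",     seq_len=8192, tokens_b=50.0,  peak_lr=5e-4, anneal=False),
    Phase("domain_specialization", seq_len=8192, tokens_b=50.0,  peak_lr=5e-4, anneal=True),
]

def lr_at(phase: Phase, progress: float) -> float:
    """Learning rate at a fractional position (0.0 to 1.0) within a phase."""
    if phase.anneal:
        return phase.peak_lr * (1.0 - progress)  # linear decay to zero
    return phase.peak_lr
```

Keeping the annealing confined to the final, in-domain phase is one common way to read "annealed domain specialization": the model sees domain data while the learning rate decays toward zero, which tends to lock in domain knowledge without destabilizing earlier general-purpose training.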
Original Abstract
Encoder-only transformers remain indispensable in retrieval, classification, and ranking systems where latency, stability, and cost are paramount. Most general-purpose encoders, however, are trained on generic corpora with limited coverage of specialized domains. We introduce RexBERT, a family of BERT-style encoders designed specifically for e-commerce semantics. We make three contributions. First, we release Ecom-niverse, a 350 billion token corpus curated from diverse retail and shopping sources. We describe a modular pipeline that isolates and extracts e-commerce content from FineFineWeb and other open web resources, and characterize the resulting domain distribution. Second, we present a reproducible pretraining recipe building on ModernBERT's architectural advances. The recipe consists of three phases: general pre-training, context extension, and annealed domain specialization. Third, we train RexBERT models ranging from 17M to 400M parameters and evaluate them on token classification, semantic similarity, and general natural language understanding tasks using e-commerce datasets. Despite having 2-3x fewer parameters, RexBERT outperforms larger general-purpose encoders and matches or surpasses modern long-context models on domain-specific benchmarks. Our results demonstrate that high-quality in-domain data combined with a principled training approach provides a stronger foundation for e-commerce applications than indiscriminate scaling alone.