LocalSUG: Geography-Aware LLM for Query Suggestion in Local-Life Services
AI Summary
LocalSUG uses an LLM to address the geographic-grounding, preference-optimization, and efficiency challenges of query suggestion in local-life services.
Key Contributions
- City-aware candidate mining that injects geographic information
- A GRPO algorithm that optimizes LLM preferences and reduces exposure bias
- Quality-aware acceleration and vocabulary pruning that lower online latency
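The city-aware candidate mining above is based on term co-occurrence. A minimal sketch of that idea, counting which query terms co-occur within each city so that candidates can be ranked per city; the function name, log format, and tokenization by whitespace are illustrative assumptions, not the paper's actual pipeline:

```python
# Illustrative sketch (assumed log format): mine per-city candidate term
# pairs from (city, query) search-log records by co-occurrence counting.
from collections import Counter, defaultdict
from itertools import combinations

def mine_city_candidates(log, top_k=2):
    """Return, for each city, the top_k term pairs ranked by how often
    they co-occur in the same query issued from that city."""
    co = defaultdict(Counter)
    for city, query in log:
        terms = sorted(set(query.split()))  # dedupe terms within a query
        for a, b in combinations(terms, 2):
            co[city][(a, b)] += 1
    return {city: counts.most_common(top_k) for city, counts in co.items()}

log = [
    ("Beijing", "hotpot restaurant"),
    ("Beijing", "hotpot restaurant"),
    ("Beijing", "hotpot delivery"),
    ("Shanghai", "coffee shop"),
]
print(mine_city_candidates(log, top_k=1))
```

In a real system the mined pairs would seed city-conditioned candidate queries fed to the LLM, rather than being used directly.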
Methodology
The LocalSUG framework combines geography-aware candidate mining, a beam-search-driven GRPO algorithm, and quality-aware acceleration to adapt LLMs to query suggestion in local-life services.
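The vocabulary-pruning component can be pictured as masking the next-token distribution to an allowed subset before each beam-search step, so the decoder never scores tokens that cannot continue a valid candidate. A minimal sketch under that assumption; the allowed-token set, which in practice would come from a trie over mined candidates, is supplied here as a plain set of token ids:

```python
# Illustrative sketch: vocabulary pruning for one decoding step.
# Tokens outside `allowed_ids` are masked to -inf, then top-k survivors
# are returned as (token_id, score) pairs for beam expansion.
import math

def prune_and_topk(logits, allowed_ids, k):
    masked = [v if i in allowed_ids else -math.inf
              for i, v in enumerate(logits)]
    order = sorted(range(len(masked)), key=lambda i: masked[i], reverse=True)
    return [(i, masked[i]) for i in order[:k] if masked[i] != -math.inf]

logits = [0.1, 2.0, 1.5, 3.0, 0.2]  # token 3 has the best raw score
print(prune_and_topk(logits, allowed_ids={1, 2, 4}, k=2))  # token 3 pruned
```

Shrinking the effective vocabulary this way reduces both the softmax/top-k cost per step and the number of beams spent on off-domain continuations, which is the latency lever the paper's pruning technique targets.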
Original Abstract
In local-life service platforms, the query suggestion module plays a crucial role in enhancing user experience by generating candidate queries based on user input prefixes, thus reducing user effort and accelerating search. Traditional multi-stage cascading systems rely heavily on historical top queries, limiting their ability to address long-tail demand. While LLMs offer strong semantic generalization, deploying them in local-life services introduces three key challenges: lack of geographic grounding, exposure bias in preference optimization, and online inference latency. To address these issues, we propose LocalSUG, an LLM-based query suggestion framework tailored for local-life service platforms. First, we introduce a city-aware candidate mining strategy based on term co-occurrence to inject geographic grounding into generation. Second, we propose a beam-search-driven GRPO algorithm that aligns training with inference-time decoding, reducing exposure bias in autoregressive generation. A multi-objective reward mechanism further optimizes both relevance and business-oriented metrics. Finally, we develop quality-aware beam acceleration and vocabulary pruning techniques that significantly reduce online latency while preserving generation quality. Extensive offline evaluations and large-scale online A/B testing demonstrate that LocalSUG improves click-through rate (CTR) by +0.35% and reduces the low/no-result rate by 2.56%, validating its effectiveness in real-world deployment.