AI Agents relevance: 8/10

Planning in 8 Tokens: A Compact Discrete Tokenizer for Latent World Model

Dongwon Kim, Gawon Seo, Jinsung Lee, Minsu Cho, Suha Kwak
arXiv: 2603.05438v1 · Published: 2026-03-05 · Updated: 2026-03-05

AI Summary

Proposes CompACT, a discrete tokenizer that compresses each observation into 8 tokens, accelerating world-model planning.

Key Contributions

  • Proposes the CompACT tokenizer, which substantially reduces the computational cost of world models
  • Significantly speeds up world-model planning
  • Preserves competitive planning performance

Methodology

Designs a discrete tokenizer that compresses each observation into a small number of tokens, then applies it within an action-conditioned world model for planning.
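To make the idea concrete, here is a minimal sketch of the quantization step such a tokenizer performs: an observation is split into 8 chunks and each chunk is snapped to the index of its nearest codebook entry, yielding 8 discrete tokens in place of hundreds. This is an illustrative assumption only; the codebook size, chunk dimensions, and `tokenize` function below are hypothetical, and CompACT's actual encoder architecture and training losses are not specified in this summary.

```python
import numpy as np

def tokenize(observation, codebook, num_tokens=8):
    """Quantize a flat observation into `num_tokens` discrete tokens.

    Each of the `num_tokens` chunks is mapped to the index of its
    nearest codebook vector (nearest-neighbor vector quantization).
    """
    chunks = observation.reshape(num_tokens, -1)              # (8, d)
    # Squared Euclidean distance from every chunk to every codebook entry.
    d2 = ((chunks[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)                                  # (8,) integer tokens

rng = np.random.default_rng(0)
codebook = rng.normal(size=(512, 16))   # hypothetical 512-entry, 16-dim codebook
obs = rng.normal(size=128)              # e.g. a flattened latent observation
tokens = tokenize(obs, codebook)
print(tokens.shape)                     # (8,): 8 tokens represent the observation
```

The payoff the paper targets follows directly: a transformer world model rolling out dynamics over 8 tokens per step attends over far shorter sequences than one operating on hundreds of tokens, which is what makes decision-time planning tractable.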

Original Abstract

World models provide a powerful framework for simulating environment dynamics conditioned on actions or instructions, enabling downstream tasks such as action planning or policy learning. Recent approaches leverage world models as learned simulators, but their application to decision-time planning remains computationally prohibitive for real-time control. A key bottleneck lies in latent representations: conventional tokenizers encode each observation into hundreds of tokens, making planning both slow and resource-intensive. To address this, we propose CompACT, a discrete tokenizer that compresses each observation into as few as 8 tokens, drastically reducing computational cost while preserving essential information for planning. An action-conditioned world model that employs the CompACT tokenizer achieves competitive planning performance with orders-of-magnitude faster planning, offering a practical step toward real-world deployment of world models.

Tags

world models  planning  tokenizer  compression

arXiv Categories

cs.CV cs.AI cs.RO