PRIOR: Perceptive Learning for Humanoid Locomotion with Reference Gait Priors
AI Summary
The PRIOR framework achieves robust humanoid locomotion over complex terrain by combining imitation learning with self-supervised learning.
Key Contributions
- Proposes PRIOR, an efficient and reproducible humanoid locomotion framework built on Isaac Lab
- Uses a parametric gait generator to supply stable reference trajectories
- Uses self-supervised heightmap reconstruction to infer terrain geometry from depth images
Methodology
Combines parametric gait generation, a GRU-based state estimator, and terrain-adaptive footstep rewards to achieve robust humanoid walking over complex terrain.
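To make the parametric-gait-generation idea concrete, here is a minimal sketch of a phase-based reference trajectory for the feet. The duty cycle, step height, and period values are illustrative assumptions, not parameters from the paper, and the paper's generator is derived from motion capture rather than a hand-written sinusoid.

```python
import math

def foot_height(phase: float, step_height: float = 0.08, duty: float = 0.6) -> float:
    """Reference vertical foot position for one leg over a gait cycle.

    phase in [0, 1): the foot stays on the ground for the first `duty`
    fraction (stance), then follows a half-sine swing arc.
    """
    phase %= 1.0
    if phase < duty:                        # stance: foot on the ground
        return 0.0
    swing = (phase - duty) / (1.0 - duty)   # normalize swing phase to [0, 1)
    return step_height * math.sin(math.pi * swing)

def reference_gait(t: float, period: float = 0.8):
    """Left/right foot height references with a half-cycle phase offset."""
    phase = t / period
    return foot_height(phase), foot_height(phase + 0.5)
```

The half-cycle offset between the two legs is what produces an alternating walking pattern; a tracking reward against such references can shape the gait without any adversarial objective.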
Original Abstract
Training perceptive humanoid locomotion policies that traverse complex terrains with natural gaits remains an open challenge, typically demanding multi-stage training pipelines, adversarial objectives, or extensive real-world calibration. We present PRIOR, an efficient and reproducible framework built on Isaac Lab that achieves robust terrain traversal with human-like gaits through a simple yet effective design: (i) a parametric gait generator that supplies stable reference trajectories derived from motion capture without adversarial training, (ii) a GRU-based state estimator that infers terrain geometry directly from egocentric depth images via self-supervised heightmap reconstruction, and (iii) terrain-adaptive footstep rewards that guide foot placement toward traversable regions. Through systematic analysis of depth image resolution trade-offs, we identify configurations that maximize terrain fidelity under real-time constraints, substantially reducing perceptual overhead without degrading traversal performance. Comprehensive experiments across terrains of varying difficulty, including stairs, boxes, and gaps, demonstrate that each component yields complementary and essential performance gains, with the full framework achieving a 100% traversal success rate. We will open-source the complete PRIOR framework, including the training pipeline, parametric gait generator, and evaluation benchmarks, to serve as a reproducible foundation for humanoid locomotion research on Isaac Lab.
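The GRU estimator with self-supervised heightmap reconstruction can be sketched as follows. This is a numpy toy, not the paper's network: the feature, hidden, and heightmap dimensions are assumptions, the decoder is a single linear layer, and the "self-supervision" comes from the fact that a simulator can sample ground-truth terrain heights around the robot for free.

```python
import numpy as np

rng = np.random.default_rng(0)
D_X, D_H, D_MAP = 32, 64, 11 * 11   # depth-feature, hidden, heightmap sizes (assumed)

def init(shape):
    return rng.normal(0.0, 0.1, shape)

W = {g: init((D_H, D_X)) for g in "zrn"}   # input weights per gate
U = {g: init((D_H, D_H)) for g in "zrn"}   # recurrent weights per gate
b = {g: np.zeros(D_H) for g in "zrn"}
W_dec = init((D_MAP, D_H))                 # linear heightmap decoder

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def gru_step(x, h):
    """One GRU update: x is the current depth-image feature, h the
    recurrent terrain estimate carried across frames."""
    z = sigmoid(W["z"] @ x + U["z"] @ h + b["z"])        # update gate
    r = sigmoid(W["r"] @ x + U["r"] @ h + b["r"])        # reset gate
    n = np.tanh(W["n"] @ x + U["n"] @ (r * h) + b["n"])  # candidate state
    return (1.0 - z) * n + z * h

def reconstruction_loss(h, target_heightmap):
    """Self-supervised target: the simulator's ground-truth height
    samples around the robot, regressed from the hidden state."""
    pred = W_dec @ h
    return float(np.mean((pred - target_heightmap) ** 2))
```

Minimizing the reconstruction loss forces the hidden state to encode local terrain geometry, so the same state can condition both the policy and the terrain-adaptive footstep rewards.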