ProOOD: Prototype-Guided Out-of-Distribution 3D Occupancy Prediction
AI Summary
ProOOD uses prototype guidance to improve out-of-distribution (OOD) detection in 3D semantic occupancy prediction, with a particular focus on long-tailed classes.
Key Contributions
- Proposes ProOOD, which combines prototype-guided semantic imputation with tail-class mining.
- Proposes EchoOOD, which fuses logit coherence with prototype matching to produce reliable OOD scores.
- Demonstrates state-of-the-art performance of ProOOD on both in-distribution prediction and OOD detection across multiple datasets.
Methodology
Enhances feature representations via prototype-guided semantic imputation and tail-class mining, then combines local and global prototype matching with local logit coherence to compute voxel-level OOD scores.
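The paper does not give the exact scoring formula here, but the idea of fusing a logit-confidence term with prototype matching can be sketched as follows. This is a minimal illustration, not ProOOD's actual implementation: the function name `ood_score`, the cosine-similarity matching, the maximum-softmax-probability term, and the fusion weight `alpha` are all assumptions for the sake of the example.

```python
import numpy as np

def ood_score(voxel_feats, logits, prototypes, alpha=0.5):
    """Hypothetical voxel-level OOD score fusing two cues:
    distance to the nearest class prototype and logit confidence.

    voxel_feats: (N, D) per-voxel features
    logits:      (N, C) per-voxel class logits
    prototypes:  (C, D) per-class prototype features
    """
    # Cosine similarity between each voxel feature and every class prototype.
    f = voxel_feats / np.linalg.norm(voxel_feats, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = f @ p.T                        # (N, C)
    proto_score = 1.0 - sim.max(axis=1)  # far from all prototypes -> more OOD

    # Logit-based cue: low maximum softmax probability -> more OOD.
    z = logits - logits.max(axis=1, keepdims=True)  # stabilize exp
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    msp_score = 1.0 - probs.max(axis=1)

    # Simple convex fusion of the two cues (weight is an assumption).
    return alpha * proto_score + (1.0 - alpha) * msp_score
```

A voxel whose feature aligns with a class prototype and whose logits are confident receives a low score, while a voxel far from every prototype with flat logits receives a high score.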
Original Abstract
3D semantic occupancy prediction is central to autonomous driving, yet current methods are vulnerable to long-tailed class bias and out-of-distribution (OOD) inputs, often overconfidently assigning anomalies to rare classes. We present ProOOD, a lightweight, plug-and-play method that couples prototype-guided refinement with training-free OOD scoring. ProOOD comprises (i) prototype-guided semantic imputation that fills occluded regions with class-consistent features, (ii) prototype-guided tail mining that strengthens rare-class representations to curb OOD absorption, and (iii) EchoOOD, which fuses local logit coherence with local and global prototype matching to produce reliable voxel-level OOD scores. Extensive experiments on five datasets demonstrate that ProOOD achieves state-of-the-art performance on both in-distribution 3D occupancy prediction and OOD detection. On SemanticKITTI, it surpasses baselines by +3.57% mIoU overall and +24.80% tail-class mIoU; on VAA-KITTI, it improves AuPRCr by +19.34 points, with consistent gains across benchmarks. These improvements yield more calibrated occupancy estimates and more reliable OOD detection in safety-critical urban driving. The source code is publicly available at https://github.com/7uHeng/ProOOD.