Multimodal Cyber-physical Interaction in XR: Hybrid Doctoral Thesis Defense
AI Summary
Proposes a multimodal framework supporting hybrid XR doctoral thesis defenses and reports its first successful use in practice.
Key Contributions
- Proposes an XR framework supporting multiple modes of participation
- Integrates full-body motion capture for natural interaction
- Leverages WebXR for cross-platform, instant access
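
The paper does not include code, but the WebXR-based "instant access" point presumably rests on the standard WebXR Device API entry flow: feature-detect `navigator.xr`, check session support, then request an immersive session, falling back to a flat browser view otherwise. A minimal hedged sketch (names and fallback behavior are illustrative, not from the paper):

```javascript
// Sketch of a WebXR entry point with graceful fallback.
// navigator.xr is the standard WebXR Device API handle; it only
// exists in WebXR-capable browsers, so we guard every step.
async function enterXR() {
  if (typeof navigator !== "undefined" && navigator.xr) {
    // Ask whether fully immersive VR is available on this device.
    const supported = await navigator.xr.isSessionSupported("immersive-vr");
    if (supported) {
      // Hand the XRSession to the app's render loop (not shown here).
      return navigator.xr.requestSession("immersive-vr");
    }
  }
  // Fallback: no immersive session; caller renders an inline 2D/3D view.
  return null;
}

enterXR().then((session) => {
  console.log(session ? "entered immersive VR" : "falling back to 2D browser view");
});
```

Because the same page serves both paths, headset users and plain-browser attendees reach the event through one URL with no installation, which is the cross-platform property the summary highlights.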
Methodology
Designed and implemented a multimodal XR framework to support a hybrid doctoral thesis defense, and analyzed user feedback.
Original Abstract
Academic events, such as a doctoral thesis defense, are typically limited to either physical co-location or flat video conferencing, resulting in rigid participation formats and fragmented presence. We present a multimodal framework that breaks this binary by supporting a spectrum of participation, from in-person attendance to immersive virtual reality (VR) or browser access, and report our findings from using it to organize the first-ever hybrid doctoral thesis defense using extended reality (XR). The framework integrates full-body motion tracking to synchronize the user's avatar motions and gestures, enabling natural interaction, including body language and gestures, between onsite participants and remote attendees in the virtual world. It leverages WebXR to provide cross-platform, instant accessibility with easy setup. User feedback analysis reveals positive VR experiences and demonstrates the framework's effectiveness in supporting various hybrid event activities.