Do You Really Need to Know Where “That” Is? Enhancing Support for Referencing in Collaborative Mixed Reality Environments

CHI 2021

Abstract

Mixed Reality has been shown to enhance remote guidance and is especially well-suited for physical tasks. Conversations during these tasks are heavily anchored around task objects and their spatial relationships in the real world, making referencing - the ability to refer to an object in a way that is understood by others - a crucial process that warrants explicit support in collaborative Mixed Reality systems. This paper presents a 2x2 mixed factorial experiment that explores the effects of providing spatial information and system-generated guidance to task objects. It also investigates the effects of such guidance on the remote collaborator’s need for spatial information. Our results show that guidance increases performance and communication efficiency while reducing the need for spatial information, especially in unfamiliar environments. Our results also demonstrate a reduced need for remote experts to be in immersive environments, making guidance more scalable, and expertise more accessible.

Publication
Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
Janet Johnson
Ph.D. Candidate
Danilo Gasques
Ph.D. Candidate

My research focuses on improving remote and co-located surgical mentoring and training, as well as creating interfaces for surgical guidance through eXtended Reality technology.

Tommy Sharkey
Ph.D. Student
Nadir Weibel
Professor of Computer Science and Engineering
