Integrating the Physical Environment into Mobile Remote Collaboration

Steffen Gauglitz  •  Cha Lee  •  Matthew Turk  •  Tobias Höllerer

With the widespread deployment of fast data connections and the availability of sensors for a variety of modalities, the potential of remote collaboration has greatly increased. While the now-ubiquitous video conferencing applications take advantage of some of these capabilities, their use of video between remote and local users is largely limited to watching disjoint video feeds, leaving much to be desired regarding direct interaction with the remote environment. Thus, teleconference-like applications have been largely successful when the matter at hand can be discussed verbally or with the help of purely digital data (such as presentation slides), but they hit severe limitations when real-world objects or environments are involved.

We describe a framework and prototype implementation for unobtrusive mobile remote collaboration on tasks that involve the physical environment. Our system uses the Augmented Reality paradigm and model-free, markerless visual tracking to enable decoupled, live-updated views of the environment and world-stabilized annotations, while supporting a moving camera and an unknown, unprepared environment.
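To make the idea of world-stabilized annotations concrete, here is a minimal sketch (in Python with NumPy; an illustration, not the actual implementation) of the underlying geometry: a pixel the remote user marks in one frame is back-projected into 3D using a depth estimate from the tracker, expressed in world coordinates via that frame's camera pose, and then re-projected into every subsequent live frame. The pose convention (x_cam = R x_world + t) and the availability of a depth estimate at the marked pixel are assumptions of this sketch.

import numpy as np

def back_project(pixel, depth, K):
    # Lift a marked pixel with an (assumed) depth estimate into the camera frame,
    # using the pinhole intrinsics K.
    u, v = pixel
    x = (u - K[0, 2]) / K[0, 0] * depth
    y = (v - K[1, 2]) / K[1, 1] * depth
    return np.array([x, y, depth])

def camera_to_world(point_cam, R, t):
    # Pose convention assumed here: x_cam = R @ x_world + t.
    return R.T @ (point_cam - t)

def reproject(anchor_world, R_live, t_live, K):
    # Project the 3D anchor into the current live frame so the annotation
    # stays "stuck" to the physical object as the camera moves.
    p = K @ (R_live @ anchor_world + t_live)
    return p[:2] / p[2]

# Placing an annotation: back-project the remote user's click once,
#   anchor = camera_to_world(back_project(click_px, depth_at_click, K), R_click, t_click)
# then, for every live frame, draw it at reproject(anchor, R_live, t_live, K).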


Left: our framework for mobile remote collaboration. Right: the remote user's viewpoint, which is decoupled from the local user's viewpoint. The two views remain registered to each other, and the live video frame, correctly registered with the frozen frame, is blended in, so that the remote user can still observe the local user's actions.
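The blending of the live frame into the frozen, decoupled view can be sketched as follows. This is an illustrative approximation, not the paper's pipeline: it assumes the two frames can be related by a homography estimated from tracked feature correspondences, and the function and parameter names are hypothetical.

import cv2
import numpy as np

def blend_live_into_frozen(frozen, live, pts_live, pts_frozen, alpha=0.4):
    # pts_live / pts_frozen: Nx2 float arrays of corresponding feature
    # locations supplied by the tracker (an assumption of this sketch).
    H, _ = cv2.findHomography(pts_live, pts_frozen, cv2.RANSAC, 3.0)
    h, w = frozen.shape[:2]
    # Warp the live frame into the frozen viewpoint so the two images are
    # registered, then alpha-blend so the local user's actions stay visible.
    warped = cv2.warpPerspective(live, H, (w, h))
    return cv2.addWeighted(frozen, 1.0 - alpha, warped, alpha, 0.0)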

To evaluate our concept and prototype, we conducted a user study with 48 participants in which a remote expert instructed a local user in operating a mock-up airplane cockpit. Users completed significantly more tasks with our prototype (40.8 on average) and with static annotations (37.3) than without annotations (28.9), and 79% of the users preferred our prototype despite its noticeably imperfect tracking.

Demo video


Publications

Integrating the Physical Environment into Mobile Remote Collaboration.
S. Gauglitz, C. Lee, M. Turk, T. Höllerer. In Proceedings of the ACM SIGCHI International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), 2012. Best Paper Honorable Mention.