Integrating the Physical Environment into Mobile Remote Collaboration
Steffen Gauglitz • Cha Lee • Matthew Turk • Tobias Höllerer
With the widespread deployment of fast data connections and the availability of sensors for a variety of modalities, the potential of remote collaboration has greatly increased. While the now ubiquitous video conferencing applications take advantage of some of these capabilities, the interaction between remote and local users is largely limited to watching disjoint video feeds, leaving much to be desired regarding direct interaction with the remote environment. Thus, teleconference-like applications have been largely successful when the matter at hand can be discussed verbally or with the support of shared digital media, but less so for tasks that involve the physical environment.
We describe a framework and prototype implementation for unobtrusive mobile remote collaboration on tasks that involve the physical environment. Our system uses the Augmented Reality paradigm and model-free, markerless visual tracking to facilitate decoupled, live updated views of the environment and world-stabilized annotations while supporting a moving camera and unknown, unprepared environments.
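World-stabilized annotations of the kind described above rest on a standard step: each annotation's 3D anchor is re-projected into every new frame using the camera pose estimated by the visual tracker, so the mark appears fixed to the scene rather than to the screen. A minimal pinhole-projection sketch of that step (the function name, intrinsics, and pose values are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def project_annotation(K, R, t, X_world):
    """Project a 3D annotation anchor into the current frame.

    K: 3x3 camera intrinsics; R, t: world-to-camera rotation and
    translation estimated by the visual tracker for this frame;
    X_world: 3-vector anchor position in world coordinates.
    Returns (u, v) pixel coordinates, or None if the anchor is
    behind the camera.
    """
    X_cam = R @ X_world + t
    if X_cam[2] <= 0:               # anchor behind the camera plane
        return None
    uvw = K @ X_cam                 # homogeneous image coordinates
    return uvw[:2] / uvw[2]         # perspective divide

# Illustrative example: identity pose, anchor 2 m straight ahead.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
uv = project_annotation(K, R, t, np.array([0.0, 0.0, 2.0]))
# a point on the optical axis projects to the principal point (320, 240)
```

As the tracker updates R and t for each frame, repeating this projection keeps the annotation registered to the same physical location even as the camera moves.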
To evaluate our concept and prototype, we conducted a user study with 48 participants in which a remote expert instructed a local user to operate a mock-up airplane cockpit. Users performed significantly better with our prototype (40.8 tasks completed on average) as well as with static annotations (37.3) than without annotations (28.9). Seventy-nine percent of the users preferred our prototype despite noticeably imperfect tracking.