Augmented Reality-based Remote Collaboration
Steffen Gauglitz • Benjamin Nuernberger • Kuo-Chin Lien • Cha Lee • Matthew Turk • Tobias Höllerer
With the widespread deployment of fast data connections and the availability of sensors for a variety of modalities, the potential of remote collaboration has greatly increased. While the now ubiquitous video conferencing applications take advantage of some of these capabilities, the use of video between remote and local users is limited largely to watching disjoint video feeds, leaving much to be desired regarding direct interaction with the remote environment. Thus, teleconference-like applications have been largely successful when the matter at hand can be discussed verbally or with the help of purely digital data (such as presentation slides), but they hit severe limitations when real-world objects or environments are involved.
We describe a framework and several prototype implementations for unobtrusive mobile remote collaboration on tasks that involve the physical environment. Our system uses the Augmented Reality paradigm and model-free, markerless visual tracking to facilitate decoupled, live-updated views of the environment and world-stabilized annotations, while supporting a moving camera and unknown, unprepared environments.
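The core of a world-stabilized annotation can be illustrated with basic projective geometry: a 2D annotation drawn on one video frame is back-projected (using the tracked camera pose and the scene depth at that pixel) to a 3D world point, which is then reprojected into every subsequent frame so the annotation stays attached to the physical object rather than to the screen. The sketch below, with hypothetical helper names and a standard pinhole camera model, is an illustration of this idea, not the authors' implementation.

```python
import numpy as np

def anchor_annotation(K, R, t, uv, depth):
    """Back-project pixel `uv` (with known depth) into world coordinates.

    Assumes the pinhole model: a world point X maps to a pixel via
    x ~ K (R X + t), with K the camera intrinsics and (R, t) the pose.
    """
    ray = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    X_cam = depth * ray            # annotation point in camera coordinates
    return R.T @ (X_cam - t)       # same point in world coordinates

def reproject(K, R, t, X_world):
    """Project a world point into a (possibly different) camera view."""
    x = K @ (R @ X_world + t)
    return x[:2] / x[2]            # perspective divide

# Example: anchor an annotation in view 1, re-render it in view 2.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R1, t1 = np.eye(3), np.zeros(3)    # view 1 pose (world = camera frame)
X = anchor_annotation(K, R1, t1, uv=(400.0, 260.0), depth=2.0)

# The camera translates 10 cm to the right; the annotation stays fixed
# in the world, so it shifts left on screen.
R2, t2 = np.eye(3), np.array([-0.1, 0.0, 0.0])
uv2 = reproject(K, R2, t2, X)      # uv2 ≈ (375.0, 260.0)
```

In the actual system the depth and pose come from the markerless visual tracking and reconstruction pipeline rather than being given, but the anchoring and reprojection steps are the same in principle.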
In order to evaluate our concept and prototypes, we conducted user studies in which a remote expert helped a local user to perform a specific task (see papers for details).
Interpreting 2D Gesture Annotations in 3D Augmented Reality
Prototype 3 (presented at VRST 2014)
Prototype 2 & user study (presented at UIST 2014)
Prototype 1 & user study (presented at MobileHCI 2012)