Augmented Reality-based Remote Collaboration

Steffen Gauglitz  •  Benjamin Nuernberger  •  Kuo-Chin Lien •  Cha Lee  •  Matthew Turk  •  Tobias Höllerer

With the widespread deployment of fast data connections and the availability of sensors for a variety of modalities, the potential of remote collaboration has greatly increased. While the now ubiquitous video conferencing applications take advantage of some of these capabilities, the use of video between remote and local users is largely limited to watching disjoint video feeds, leaving much to be desired regarding direct interaction with the remote environment. Thus, teleconference-like applications have been largely successful when the matter at hand can be discussed verbally or with the help of purely digital data (such as presentation slides), but they hit severe limitations when real-world objects or environments are involved.

We describe a framework and several prototype implementations for unobtrusive mobile remote collaboration on tasks that involve the physical environment. Our system uses the Augmented Reality paradigm and model-free, markerless visual tracking to facilitate decoupled, live-updated views of the environment and world-stabilized annotations, while supporting a moving camera and unknown, unprepared environments.
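The papers describe the actual tracking and annotation pipeline; as a rough illustration of the world-stabilization idea only, the sketch below assumes a simple pinhole camera model and that the visual tracker supplies camera poses and the depth of the annotated surface (all function and variable names are illustrative, not taken from our implementation):

```python
import numpy as np

def unproject(K, R, t, uv, depth):
    """Lift a 2D annotation pixel into a 3D world point, given camera
    intrinsics K, a world-to-camera pose (R, t), and the depth of the
    annotated surface (e.g., from the tracked scene model)."""
    ray_cam = depth * np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    return R.T @ (ray_cam - t)

def project(K, R, t, X):
    """Reproject the anchored 3D point into another camera view."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

# Example: the local user taps pixel (320, 240); the tracker reports the
# surface 2 m away. The annotation is anchored in 3D 
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
R1, t1 = np.eye(3), np.zeros(3)                  # local camera pose
X = unproject(K, R1, t1, (320, 240), depth=2.0)  # world anchor point

# Redrawing in a decoupled viewpoint: a camera translated 0.1 m to the
# right sees the annotation shifted left, glued to the same surface.
R2, t2 = np.eye(3), np.array([-0.1, 0, 0])
uv_remote = project(K, R2, t2, X)
```

Because the annotation is stored as a 3D point rather than a screen position, it stays attached to the physical surface as either camera moves.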

Left: Our framework for mobile remote collaboration. Right: The remote user's viewpoint, which is decoupled from the local user's viewpoint. The two views remain registered to each other, and the live video frame, correctly registered with the frozen frame, is blended in, so that the remote user can still observe the local user's actions.
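The registered blending shown on the right can be sketched as follows, assuming (for illustration only) a locally planar scene, so that the relative camera motion from the tracker induces a homography between the live and frozen frames; the grayscale warp-and-blend below is a minimal stand-in for the real rendering pipeline:

```python
import numpy as np

def planar_homography(K, R, t, n, d):
    """Homography mapping live-frame pixels to frozen-frame pixels for a
    plane with unit normal n at distance d, given the relative camera
    motion (R, t): the standard H = K (R - t n^T / d) K^-1."""
    return K @ (R - np.outer(t, n) / d) @ np.linalg.inv(K)

def blend(frozen, live, H, alpha=0.5):
    """Inverse-warp the live frame into the frozen frame's coordinates
    (nearest-neighbor sampling, grayscale) and alpha-blend where the
    warped live frame lands inside the frozen view."""
    h, w = frozen.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs, ys, np.ones_like(xs)], axis=-1) @ np.linalg.inv(H).T
    u = np.round(pts[..., 0] / pts[..., 2]).astype(int)
    v = np.round(pts[..., 1] / pts[..., 2]).astype(int)
    valid = (u >= 0) & (u < live.shape[1]) & (v >= 0) & (v < live.shape[0])
    out = frozen.astype(float).copy()
    out[valid] = (1 - alpha) * out[valid] + alpha * live[v[valid], u[valid]]
    return out
```

In the actual system, the registration comes from the visual tracker rather than a single-plane assumption, but the principle is the same: warp the live frame into the frozen view's coordinates before compositing.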

In order to evaluate our concept and prototypes, we conducted user studies in which a remote expert helped a local user to perform a specific task (see papers for details).


Interpreting 2D Gesture Annotations in 3D Augmented Reality


Prototype 3 (presented at VRST 2014)


Prototype 2 & user study (presented at UIST 2014)


Prototype 1 & user study (presented at MobileHCI 2012)




Interpreting 2D Gesture Annotations in 3D Augmented Reality.
B. Nuernberger, K.-C. Lien, T. Höllerer, M. Turk. Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI), Greenville, SC, USA, March 2016. Honorable Mention for Best Paper

In Touch with the Remote World: Remote Collaboration with Augmented Reality Drawings and Virtual Navigation.
S. Gauglitz, B. Nuernberger, M. Turk, T. Höllerer. Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology (VRST), Edinburgh, UK, November 2014.
World-Stabilized Annotations and Virtual Scene Navigation for Remote Collaboration.
S. Gauglitz, B. Nuernberger, M. Turk, T. Höllerer. Proceedings of the 27th ACM Symposium on User Interface Software and Technology (UIST), Honolulu, Hawaii, USA, October 2014.
Integrating the Physical Environment into Mobile Remote Collaboration.
S. Gauglitz, C. Lee, M. Turk, T. Höllerer. Proceedings of the ACM SIGCHI International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), 2012. Best Paper Honorable Mention