The CleAR Sight research platform enables multiple people to use a touch-enabled, transparent interaction panel for tasks such as working with abstract data visualizations, exploring volumetric data sets, and making in-situ annotations.

Abstract

In this paper, we examine the potential of incorporating transparent, handheld devices into head-mounted Augmented Reality (AR). Additional mobile devices have long been successfully used in head-mounted AR, but they obscure the visual context and real world objects during interaction. Transparent tangible displays can address this problem, using either transparent OLED screens or rendering by the head-mounted display itself. However, so far, there is no systematic analysis of the use of such transparent tablets in combination with AR head-mounted displays (HMDs), with respect to their benefits and arising challenges. We address this gap by introducing a research platform based on a touch-enabled, transparent interaction panel, for which we present our custom hardware design and software stack in detail. Furthermore, we developed a series of interaction concepts for this platform and demonstrate them in the context of three use case scenarios: the exploration of 3D volumetric data, collaborative visual data analysis, and the control of smart home appliances. We validate the feasibility of our concepts with interactive prototypes that we used to elicit feedback from HCI experts. As a result, we contribute to a better understanding of how transparent tablets can be integrated into future AR environments.
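To make the core interaction principle concrete: touch input is sensed in the panel's 2D surface coordinates, while the panel itself is tracked in 3D, so every touch can be registered in the AR scene. The sketch below (Python with NumPy) illustrates this mapping under our own assumptions; the function names, the 4x4 panel-to-world pose convention, and the panel dimensions are hypothetical and not taken from the CleAR Sight source code:

import numpy as np

def touch_to_world(touch_uv, panel_pose, panel_size=(0.26, 0.18)):
    """Map a normalized touch point (u, v) in [0, 1]^2 on the tracked
    transparent panel to 3D world coordinates.

    panel_pose -- 4x4 panel-to-world transform from the tracking system
    panel_size -- panel width/height in meters (hypothetical values)
    """
    u, v = touch_uv
    w, h = panel_size
    # Touch point in the panel's local frame, origin at the panel center.
    local = np.array([(u - 0.5) * w, (v - 0.5) * h, 0.0, 1.0])
    return (panel_pose @ local)[:3]

def touch_ray(touch_uv, panel_pose, head_position):
    """Ray from the user's head through the touch point, e.g. to select
    volumetric data or scene objects seen through the transparent panel."""
    p = touch_to_world(touch_uv, panel_pose)
    d = p - head_position
    return p, d / np.linalg.norm(d)

# Example: touch at the panel center, panel held 1 m in front of the head.
pose = np.eye(4)
pose[2, 3] = 1.0
origin, direction = touch_ray((0.5, 0.5), pose, head_position=np.zeros(3))

Because the panel is transparent, such a head-through-touch ray lets users point at real or virtual content behind the panel without occluding it, which is one way to realize the selection and in-situ annotation tasks described above.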

Research Article

Download Pre-Print

Accompanying Video

Building Instructions & Source Code

Step-by-step building instructions, additional resources, and source code are available:

  • Step-by-Step Building Instructions
  • Application Source Code

Publications



Related Student Theses

  • Katja Krug

    Unterstützung von Augmented-Reality-Datenanalyse mittels 3D-registrierter Eingabe auf einer transparenten Oberfläche (Supporting Augmented Reality Data Analysis by Means of 3D-Registered Input on a Transparent Surface)

    February 5, 2021 until July 9, 2021

    Supervision: Wolfgang Büschel, Raimund Dachselt

Acknowledgments

This work was funded by the Deutsche Forschungsgemeinschaft (DFG) under Germany’s Excellence Strategy – EXC 2050/1 – 390696704 – Cluster of Excellence “Centre for Tactile Internet with Human-in-the-Loop” (CeTI) and EXC 2068 – 390729961 – Cluster of Excellence “Physics of Life”, as well as DFG grant 389792660 as part of TRR 248 – CPEC (see https://perspicuous-computing.science). We also acknowledge funding by the Federal Ministry of Education and Research of Germany in the program “Souverän. Digital. Vernetzt.”, joint project 6G-life, project ID 16KISK001K, and by the Sächsische Aufbaubank (SAB), project ID 100400076, “Augmented Reality and Artificial Intelligence supported Laparoscopic Imagery in Surgery” (ARAILIS), as TG 70.