AR-Technology-Based Locationing System for Interactive Content

In this research, we propose an interactive display system with a simple implementation. The system is composed of a smartphone, a PC, a display, and a cloud service; it realizes a CAVE-like immersive system using simple devices. Relative location information is acquired from the user's smartphone and a marker (or markers), and based on this information the system renders the appropriate image in real time.
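To make the "relative location from a marker" idea concrete, the following is a minimal sketch of how a smartphone might estimate its position from a single on-screen marker under a pinhole-camera model. The function name, the calibration constants, and the fronto-parallel assumption are ours for illustration; the paper does not specify this computation, and a real implementation would use a full marker-pose estimator.

```python
# Hypothetical calibration values; real ones come from camera calibration.
MARKER_SIZE_M = 0.10        # physical edge length of the displayed marker (meters)
FOCAL_LENGTH_PX = 1000.0    # camera focal length expressed in pixels

def relative_location(marker_px, center_x_px, center_y_px,
                      image_w_px=1920, image_h_px=1080):
    """Estimate the camera position relative to a fronto-parallel marker.

    marker_px: apparent edge length of the marker in the image (pixels).
    Returns (x, y, z) in meters, with z the distance along the optical axis.
    Assumes a pinhole camera and a marker facing the camera head-on.
    """
    # Pinhole model: apparent size = focal_length * real_size / distance.
    z = FOCAL_LENGTH_PX * MARKER_SIZE_M / marker_px
    # Back-project the marker center's pixel offset into meters at depth z.
    x = (center_x_px - image_w_px / 2) * z / FOCAL_LENGTH_PX
    y = (center_y_px - image_h_px / 2) * z / FOCAL_LENGTH_PX
    return (x, y, z)
```

For example, a 0.10 m marker that appears 100 px wide at the image center would place the camera 1.0 m in front of the display. This (x, y, z) tuple is the kind of payload the smartphone would send to the location server.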

In recent years, personal head-mounted displays (HMDs) such as the Oculus Rift have become widely available, and immersive virtual-world display systems, such as gaming environments, have drawn public attention. Not only HMDs but also immersive projection technology (IPT)-based displays, such as the Cave Automatic Virtual Environment (CAVE) system, have been developed as immersive displays. (The CAVE system is an immersive virtual reality environment in which projectors are directed at several walls of a room-sized cube.) IPT display methods require displays or screens located around the user, whereas an HMD covers the user's eyes. Both systems interactively change the displayed image based on the user's movement and produce an immersive sensory experience.

However, both systems have several problems. IPT display methods require location sensors, multiple projectors, and multiple screens; these are expensive, lack portability, and require a large space. Thus, the IPT display method is unsuited to personal use, and its use is limited to research or large events. On the other hand, an HMD realizes a personal immersive environment and is now becoming more widespread. However, the device covers the user's eyes and thus cannot be used in everyday life. In addition, it may induce VR sickness owing to discrepancies between the user's movement and the displayed image.

In this study, we propose a simple, novel immersive display framework that employs a smartphone, a basic PC, an LCD display, and a cloud service. Using this framework, we create simple interactive content and evaluate the functionality of the location information system. As shown in Figure 1, a smartphone captures the displayed marker and sends the calculated relative location data to a location server in real time. An image server acquires the data and controls the displayed computer graphics. By controlling the displayed image based on the user's location information, the framework realizes a motion-parallax-based 3D display. By evaluating the latency of the system, we verify the functionality of the proposed framework and its capability to display interactive content.
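The motion-parallax display described above amounts to rendering the scene through an asymmetric (off-axis) viewing frustum whose shape follows the viewer's position relative to the screen. The sketch below computes the near-plane frustum bounds for a tracked eye position; the function name and parameters are our illustrative assumptions, not the paper's implementation, and the math is the standard off-axis perspective construction used in fish-tank VR.

```python
def offaxis_frustum(eye, screen_w, screen_h, near=0.1):
    """Compute asymmetric near-plane frustum bounds (left, right, bottom, top).

    eye: (x, y, d) viewer position relative to the screen center, in meters,
         where d > 0 is the eye-to-screen distance.
    screen_w, screen_h: physical screen dimensions in meters.
    The bounds can be fed to an OpenGL-style glFrustum-like projection so the
    rendered image shifts with the viewer, producing motion parallax.
    """
    ex, ey, d = eye
    # Project the physical screen edges, as seen from the eye, onto the near plane.
    left   = (-screen_w / 2 - ex) * near / d
    right  = ( screen_w / 2 - ex) * near / d
    bottom = (-screen_h / 2 - ey) * near / d
    top    = ( screen_h / 2 - ey) * near / d
    return (left, right, bottom, top)
```

With the viewer centered, the frustum is symmetric; as the viewer moves (e.g., 0.1 m to the right), the bounds shift accordingly, so the image server only needs each new (x, y, d) estimate from the location server to update the projection every frame.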

Publications

  • Satoshi Saga, Ryota Oki, Shusuke Kawagoe, Wanjia Zheng & Jiacheng Sun. AR-Technology-Based Locationing System for Interactive Content. In Proceedings of HCI International 2015, 2015.

Acknowledgement