A Stereoscopic Vision System with Delay Compensation for 360° Remote Reality

Tamay Aykut, Stefan Lochbrunner, Mojtaba Karimi, Burak Cizmeci, Eckehard Steinbach

Thematic Workshops '17: Proceedings of the Thematic Workshops of ACM Multimedia 2017, Mountain View, California, USA, October 2017.

Abstract

The rapid development of virtual reality systems and their increasing acceptance result in a high demand for 3D content, in particular content that can be viewed in 360°. The acquisition of monoscopic 360° video is straightforward and is typically done with a single camera combined with panoramic optics or with an arrangement of two 180° fisheye cameras. Stereoscopic 360° video supports the perception of depth but is considerably more challenging to capture. It requires sophisticated multi-camera arrangements, resulting in heavy, bulky, and expensive systems, and it needs computationally demanding post-processing. For telepresence applications, where the user wears VR glasses to get a 3D view of the scene in front of a mobile telepresence platform, such a vision system must be real-time capable. Another challenge is the network-induced delay, which can cause motion sickness. In this paper, we present a stereoscopic vision system that captures stereo video content for VR displays and consists of only two cameras, a pan-tilt unit, and a visual delay compensation algorithm. With our approach, the user can rotate their head by a full 360°. The proposed system compensates for the perceived delay when the user rotates the head around the z-axis by streaming a larger field-of-view (FoV) than needed by the VR display. We provide an analytical solution for the required camera FoV as a function of the FoV of the head-mounted display (HMD) and the communication delay. Our experimental evaluation for typical head motions shows that the proposed system achieves a mean compensation rate of up to 95% for the tested communication delays of 0–500 ms.
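The abstract does not reproduce the analytical solution itself. The following Python sketch only illustrates the underlying idea under a simple additive-margin assumption: the streamed FoV must exceed the HMD FoV by the maximum yaw the head can accumulate during the round-trip delay, and the displayed viewport is shifted inside the oversized frame by the yaw difference between capture and display time. All function names, parameters, and numeric values are chosen here for illustration and are not taken from the paper.

```python
def required_camera_fov(hmd_fov_deg: float,
                        max_yaw_rate_deg_s: float,
                        delay_s: float) -> float:
    """Horizontal camera FoV (degrees) needed so that a head rotation at up
    to max_yaw_rate_deg_s during a communication delay of delay_s seconds
    still falls inside the streamed image (additive-margin assumption)."""
    margin = max_yaw_rate_deg_s * delay_s          # worst-case yaw drift per side
    return min(360.0, hmd_fov_deg + 2.0 * margin)  # clamp at a full panorama


def crop_offset_deg(yaw_at_display_deg: float, yaw_at_capture_deg: float) -> float:
    """Yaw offset (degrees) by which the displayed viewport is shifted inside
    the oversized streamed frame to hide the delay-induced lag."""
    diff = yaw_at_display_deg - yaw_at_capture_deg
    # Wrap to (-180, 180] so the shortest rotation direction is used.
    return (diff + 180.0) % 360.0 - 180.0


if __name__ == "__main__":
    # Example: 110 deg HMD FoV, 200 deg/s peak yaw rate, 250 ms delay.
    print(f"required camera FoV: {required_camera_fov(110.0, 200.0, 0.25):.1f} deg")
    print(f"viewport crop offset: {crop_offset_deg(35.0, 20.0):.1f} deg")
```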