Delay Compensation for Actuated Stereoscopic 360 Degree Telepresence Systems with Probabilistic Head Motion Prediction

Tamay Aykut, Christoph Burgmair, Mojtaba Karimi, Jingyi Xu, Eckehard Steinbach

IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Tahoe, USA, March 2018.


The usability of telepresence applications is strongly affected by the communication delay between the user and the remote system. Special attention must be paid when the remote scene is experienced through a Head-Mounted Display. A high motion-to-photon latency, which describes the time needed to fully reflect the user's motion on the display, results in a poor feeling of presence. Further consequences include severe motion sickness, indisposition, and, in the worst case, termination of the telepresence session. In this paper, we present our low-cost MAVI telepresence system, which is equipped with a stereoscopic 360° vision system and high-payload manipulation capabilities. Special emphasis is placed on the stereoscopic vision system and its delay compensation. More specifically, we propose velocity-based dynamic field-of-view adaptation techniques to reduce the occurrence of simulator sickness and to improve the achievable level of delay compensation. The proposed delay compensation approach relies on deep learning to predict the prospective head motion. We use our previously described head motion dataset for training, validation, and testing. To prove the general validity of our approach, we perform cross-validation with another independent dataset. We use both objective measures and subjective experiments for evaluation. Our results show that the proposed approach achieves mean compensation rates of around 99.9% for latencies between 0.1 s and 0.5 s.
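The core idea can be illustrated with a minimal sketch: extrapolate the head pose to the expected display time and shrink the rendered field of view as head velocity grows. Note that the paper uses a learned probabilistic predictor; the constant-velocity extrapolation below, as well as the function names and thresholds, are simplified placeholder assumptions for illustration only.

```python
def predict_yaw(yaw_deg, yaw_vel_deg_s, latency_s):
    """Extrapolate the head yaw to the expected display time.

    Constant-velocity baseline; the paper's actual predictor is a
    deep-learning model trained on head motion data.
    """
    return (yaw_deg + yaw_vel_deg_s * latency_s) % 360.0


def adapted_fov(base_fov_deg, yaw_vel_deg_s, max_vel_deg_s=180.0, min_scale=0.6):
    """Velocity-based dynamic field-of-view adaptation.

    Shrink the rendered FoV as head velocity increases, masking
    prediction errors at the viewport edges. The velocity cap and
    minimum scale are illustrative values, not taken from the paper.
    """
    v = min(abs(yaw_vel_deg_s), max_vel_deg_s) / max_vel_deg_s
    scale = 1.0 - (1.0 - min_scale) * v
    return base_fov_deg * scale
```

With a 0.5 s latency and a head turning at 40°/s, the viewport would be rendered 20° ahead of the current yaw, while the FoV narrows proportionally to the turn rate.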