Abstract
Designing low end-to-end latency system architectures for virtual
reality is still an open and challenging problem. We describe the design,
implementation and evaluation of a client-server depth-image
warping architecture that updates and displays the scene graph at
the refresh rate of the display. Our approach works for scenes consisting
of dynamic and interactive objects. The approach minimizes end-to-end latency
and produces smooth object motion; however, this comes at the expense of
image quality, a trade-off inherent to warping techniques.
To improve image quality, we present a novel way of detecting
and resolving occlusion errors due to warping. Furthermore,
we investigate the use of asynchronous data transfers to increase
the architecture's performance in a multi-GPU setting. Besides
polygonal rendering, we also apply image-warping techniques to
iso-surface rendering. Finally, we evaluate the architecture and its
design trade-offs by comparing latency and image quality to a conventional
rendering system. Our experience with the system confirms
that the approach facilitates common interaction tasks such as
navigation and object manipulation.
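
For illustration, the sketch below shows the core reprojection step behind generic per-pixel depth-image warping: each reference-image sample is unprojected using its stored depth and then re-projected with the most recent viewing transform. This is a minimal sketch under assumed pinhole-camera conventions; the structures, function names (warpPixel, invRefViewProj, curViewProj), and matrix layout are illustrative assumptions and not taken from the paper's implementation.

```cpp
// Minimal sketch of forward depth-image warping for a single pixel.
// All names and conventions here are assumptions for illustration.
#include <array>
#include <cmath>

struct Vec3 { float x, y, z; };
struct Mat4 { std::array<float, 16> m; };  // row-major 4x4 matrix

// Multiply a 4x4 matrix with a homogeneous point (w = 1).
static std::array<float, 4> mul(const Mat4& M, const Vec3& p) {
    std::array<float, 4> r{};
    const float v[4] = { p.x, p.y, p.z, 1.0f };
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            r[i] += M.m[i * 4 + j] * v[j];
    return r;
}

// Reproject a reference-image sample (u, v) with stored depth d into the
// current view: unproject with the inverse reference view-projection, then
// project with the up-to-date view-projection for the display frame.
bool warpPixel(float u, float v, float d,
               const Mat4& invRefViewProj, const Mat4& curViewProj,
               float& outU, float& outV, float& outDepth) {
    // Normalized device coordinates of the reference sample.
    Vec3 ndc{ 2.0f * u - 1.0f, 2.0f * v - 1.0f, 2.0f * d - 1.0f };

    // Back-project to a world-space position.
    auto world = mul(invRefViewProj, ndc);
    if (std::fabs(world[3]) < 1e-6f) return false;
    Vec3 pos{ world[0] / world[3], world[1] / world[3], world[2] / world[3] };

    // Re-project with the latest head pose; reject points behind the camera.
    auto clip = mul(curViewProj, pos);
    if (clip[3] <= 0.0f) return false;

    outU = 0.5f * (clip[0] / clip[3] + 1.0f);
    outV = 0.5f * (clip[1] / clip[3] + 1.0f);
    outDepth = 0.5f * (clip[2] / clip[3] + 1.0f);
    return true;
}
```

Warping every sample this way yields an updated image at display rate without re-rendering the scene, but disocclusions and resampling holes remain, which is the image-quality trade-off discussed in the abstract.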
Papers and Documents
- Smit, F. A., van Liere, R., Beck, S., Fröhlich, B.
A Shared-Scene-Graph Image-Warping Architecture for VR: Low Latency versus Image Quality
Elsevier Computers & Graphics, 2009.
[preprint]