Streaming services, such as Netflix or Amazon Prime, are widely used. But the next wave of digital media is imminent: cloud gaming. This technology is similar to video-on-demand services. A computer game is run on a server in the cloud. The players access the server via an internet connection and receive an audio/video stream on their personal device. Players no longer have to own a powerful gaming device; instead, they just need a fast internet connection, capable of streaming large amounts of data from the cloud with low latency.
Cloud computing has the potential to elevate VR games to the next level. However, the bandwidth requirements are still challenging: a fluid VR display requires up to ten times more computational performance to generate enough pixels at a sufficiently high frame rate, and correspondingly more data must be transmitted. Traditional video transmission is quickly pushed beyond its limits. Dieter Schmalstieg, head of the Institute of Computer Graphics and Vision, and his team have developed a novel method with breakthrough potential for untethered VR experiences.
Drastic latency improvements
Their method, called “shading atlas streaming”, can deliver compelling VR experiences with significantly fewer bits per second transmitted over the network. Schmalstieg explains, “We are not streaming videos, but geometrically encoded data, which is decoded on the VR headset and converted into an image.”
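The division of labour behind this can be pictured roughly as follows. The sketch below is only a minimal illustration with simplified stand-in types and functions (`Patch`, `ShadingAtlas`, `shade_patch`, `client_frame`) rather than the actual GPU pipeline: the server shades visible surface patches into a texture atlas, and the headset rasterizes the received geometry with that atlas using its own, current head pose.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Patch:
    """A small piece of scene geometry plus the atlas region holding its shading."""
    vertices: List[Tuple[float, float, float]]   # object-space positions
    atlas_rect: Tuple[int, int, int, int]        # (x, y, width, height) inside the atlas

@dataclass
class ShadingAtlas:
    """A single large texture into which the shading of all visible patches is packed."""
    width: int
    height: int
    texels: bytearray                            # packed RGB shading samples

def shade_patch(patch: Patch, pose, atlas: ShadingAtlas) -> None:
    """Stand-in for the GPU shading pass: fill the patch's atlas rectangle."""
    x, y, w, h = patch.atlas_rect
    for row in range(y, y + h):
        start = (row * atlas.width + x) * 3
        atlas.texels[start:start + w * 3] = bytes([200]) * (w * 3)

def server_frame(patches: List[Patch], predicted_pose):
    """Server: shade every visible patch into the atlas for a predicted head pose.
    Only the atlas texture and the compact patch geometry travel over the network,
    not a finished video frame."""
    atlas = ShadingAtlas(width=1024, height=1024, texels=bytearray(1024 * 1024 * 3))
    for patch in patches:
        shade_patch(patch, predicted_pose, atlas)
    return atlas, patches

def client_frame(atlas: ShadingAtlas, patches: List[Patch], current_pose) -> str:
    """Headset: rasterize the received geometry, textured from the atlas, using the
    latest locally measured head pose (stand-in for the real GPU rasterizer)."""
    return f"rendered {len(patches)} textured patches from pose {current_pose}"

# Tiny usage example with a single triangle patch.
triangle = Patch(vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)], atlas_rect=(0, 0, 16, 16))
atlas, geometry = server_frame([triangle], predicted_pose=(0.0, 1.7, 0.0))
print(client_frame(atlas, geometry, current_pose=(0.01, 1.7, 0.0)))
```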
Latency – the temporal delay caused by signal transmission, storage or processing of data packets – is compensated for by the system. “It is physically impossible to remove all latency. But our encoding allows correct images to be predicted for a small temporal window into the future. As a result, physical latency is compensated for, and the user does not perceive any delays,” says Schmalstieg. Only a few pixel errors from mispredictions remain – too few to be perceived by the users.
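To make the compensation concrete, the following sketch assumes a plain constant-velocity predictor and purely illustrative numbers for latency and head motion; the prediction used in the actual system is not specified here.

```python
# Illustrative latency-compensation sketch. The latency and head-motion numbers
# are assumptions for the example, not measurements from the actual system.

LOOK_AHEAD_S = 0.050   # assumed round-trip plus processing delay to bridge

def predict_position(position, velocity, dt):
    """Constant-velocity extrapolation of the head position dt seconds ahead;
    a stand-in for real head-motion prediction."""
    return tuple(p + v * dt for p, v in zip(position, velocity))

# Server side: shade the atlas for the pose expected at display time.
head_pos = (0.00, 1.70, 0.00)      # metres
head_vel = (0.30, 0.00, 0.00)      # metres per second (moderate head movement)
shaded_for = predict_position(head_pos, head_vel, LOOK_AHEAD_S)

# Headset side: by the time the data arrives, the head has moved on. Because the
# headset holds geometry plus shading rather than a flat video frame, it renders
# from the pose it measures *now*; only the shading itself is LOOK_AHEAD_S old.
measured_now = (0.016, 1.70, 0.00)

error_cm = 100 * max(abs(a - b) for a, b in zip(shaded_for, measured_now))
print(f"shading was computed {error_cm:.1f} cm away from the true viewpoint")
```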
An interview with Dieter Schmalstieg about inventions and how they change the world can be read on Planet Research.
Efficient use of existing hardware
In practice, it is important to be able to integrate the new technology into existing infrastructure. For this purpose, the researchers use conventional MPEG video compression to encode and transmit the data. MPEG decoding capabilities already exist in VR headsets. Therefore, Shading Atlas Streaming can be used without investing in new hardware.
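One way to picture this reuse: the shading atlas is simply treated as a video frame and pushed through a standard MPEG-family encoder. The sketch below assumes the ffmpeg command-line tool (with libx264) as a stand-in for the server-side encoder, and the atlas dimensions are illustrative.

```python
import subprocess

# Illustrative atlas size and frame rate; the real values depend on the content.
ATLAS_W, ATLAS_H, FPS = 1024, 1024, 60

# ffmpeg (assumed to be installed, with libx264) stands in for the server encoder.
encoder = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "rgb24",        # raw atlas texels arrive on stdin
        "-s", f"{ATLAS_W}x{ATLAS_H}", "-r", str(FPS),
        "-i", "-",
        "-c:v", "libx264",
        "-preset", "ultrafast", "-tune", "zerolatency",
        "-f", "h264", "pipe:1",                       # elementary stream to stdout
    ],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.DEVNULL,
)

# One synthetic grey atlas frame stands in for real shading data.
frame = bytes([128]) * (ATLAS_W * ATLAS_H * 3)
compressed, _ = encoder.communicate(frame)

# The resulting bitstream is exactly what the MPEG decoder already built into
# the headset can turn back into the atlas texture.
print(f"{len(frame)} bytes of atlas -> {len(compressed)} bytes of H.264")
```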
Shading Atlas Streaming is generally applicable to all areas involving 3D data and VR headsets. The researchers are working with US chip manufacturer Qualcomm on commercial exploitation of their research results.
Video: Shading Atlas Streaming
This research project is anchored in the field of expertise “Information, Communication & Computing”, one of the five strategic focal areas of TU Graz.