
VR games can look amazing with this game-changing imaging tech

New displays and something called foveated rendering could make a world of difference.

Ian Sherr, Contributor and Former Editor at Large / News

The first time you try VR, you'll likely have the same reaction I did: It feels pretty cool to be enveloped in a computer-generated world that puts you inside an epic spaceship battle or scuba diving next to a giant blue whale.

But after spending a few minutes adjusting to virtual reality, you realize it doesn't look all that much like reality. The image quality usually isn't even as good as on an Xbox or PlayStation console. The light coming through those tree branches looks wrong. The reflection from the train window feels off. Zombies really don't wear boxy clothing and move like that (at least I don't think they do).

Zombies aside, that sense of unreality may soon go away, thanks to new imaging technologies called foveated rendering and foveated displays.

It's named after the fovea, the spot in your eye where vision is sharpest, as opposed to the periphery, where everything is blurry and unfocused.

Our visual anatomy has inspired digital technology that uses sensors to determine where you're looking. It then tells a computer to work harder on the areas you're focused on and take a break when working on places you're less interested in.

The goal is to cut the computer's workload substantially -- in some cases by half or more. That means developers will either be able to save power and offer longer battery life when you're on the go, or dramatically increase the quality of the images you see in a game without requiring you to upgrade to more powerful chips for your VR adventures.
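For a rough sense of the savings, here's a back-of-envelope sketch in Python. The percentages are illustrative assumptions of my own, not figures from Oculus, Tobii or any headset maker:

```python
# Illustrative, assumed numbers -- not measurements from any real headset.
fovea_share = 0.15       # fraction of the screen kept at full quality
periphery_cost = 0.25    # relative cost of shading the rest at reduced quality

relative_work = fovea_share * 1.0 + (1 - fovea_share) * periphery_cost
print(f"Shading work drops to roughly {relative_work:.0%} of the original")  # ~36%
```

Under those assumptions, the pixel-shading work falls to about a third of what a headset would do if it rendered the whole screen at full quality.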

A demonstration of what foveated rendering can look like. On top, a normal image. On bottom, a simulated foveated image, with a focus on the clock in the upper right. (Image: Nvidia)

Unless you're talking to a nerd like me, you may never hear about foveated tech. Still, it's poised to make serious waves in the VR world. The zombies could look more convincing, the reflections might appear more realistic and the light shimmering through that tree may feel more natural.

That means the next generation of entry-level virtual reality devices, like Samsung's $129 Gear VR or Google's $99 Daydream View headsets, won't seem quite so underpowered. They might not fall behind techwise as fast, either.

"We can create an experience that is really compelling and intuitive and natural that takes the virtual experience inside the headset to an entirely new level," said Henrik Eskilsson, CEO of Tobii, a company that makes eye-tracking sensors used in Dell , Acer and MSI laptops . He says the technology will also be in many next-generation VR headsets coming in the next two years. (Tobii says he can't name the coming products.)

This isn't some pie-in-the-sky idea. The $199 Oculus Go all-in-one headset from Facebook's Oculus VR division has a version of foveated smarts running in it today, helping to make games like the spaceship battle game Anshar Online look that much more epic.

"It's not just a $199 low-cost entrant," said Madhu Muthukumar, who's helping head up the team building software for Oculus. "It's actually really premium where it counts."  

Or rather, where you can see it.

New techniques like foveated rendering are just the latest example of how fast the world of VR is advancing. Despite decades of development, including a boom and bust in the 1980s and '90s propelled by visionaries like Jaron Lanier and game makers like Nintendo, the smarts behind VR have only just begun to change enough to offer something you might actually want to own.

Credit the current renaissance to the smartphone in your pocket, and the sensors that detect when you're looking at it in portrait or landscape mode, whether you're facing north or south or even if you're in the basement or top floor of a building. Because those sensors are produced in bulk, the cost per sensor has dropped. That lets entrepreneurs put them in everything from kid GPS trackers to more affordable drones and, yes, VR headsets.

In this case, eye tracking is powered by sensors like the one Apple uses to detect your face on the iPhone X for logging in. And industry experts say foveated rendering is a vital step toward solving nagging issues like battery life and poor visuals.

It's also showing up in AR headsets. Both the $2,295 Magic Leap and the $3,500 Microsoft HoloLens 2 have eye-tracking technology built in. In both cases, though, eye tracking is there to help people more easily identify the things they're looking at, rather than to make better visuals. Still, it's an important addition, according to Magic Leap and Microsoft.

As eye tracking spreads across the industry, it will also render many of today's headsets obsolete, forcing early adopters to upgrade. Even so, the result could make VR far more appealing.

"I can't imagine virtual reality, and especially next-generation smart glasses, without this type of technology," said CNET's Scott Stein after he tried the technology.

How it works

It turns out that the area we humans can see in sharp focus is actually quite limited. Our sharpest vision covers only an arc of a few degrees, though we can shift our gaze about 55 degrees in either direction, according to the National Institutes of Health.

That's part of why a VR headset like the $399 Oculus Rift displays computer images across a field of view of 110 degrees or more, keeping everything large, crisp and clear because you might turn your gaze somewhere other than the center of the screen.

Let's say you're looking forward down an empty hallway of a spaceship, and suddenly you hear a noise to the right that could be the alien hunting you. You don't want to make any sudden moves, so you turn your eyes to the right but keep facing forward. The screen needs to be there to keep you immersed.

This is where foveated rendering comes in. By using a mix of infrared light and sensors to follow your eyes, the headset can track where within that 55-degree range your gaze is pointed at all times. That allows the computer to cut the quality of everything else until you need it. If you focus your gaze to the right, everything to the left can look worse. Since your eyes won't see it, the computer can do significantly less work than it did before.
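For the technically curious, here's a minimal sketch of that idea in Python. The function names, the two-pass renderer and the numbers are my own illustration, not Oculus' or Tobii's actual code; the gist is that a full-quality image and a cheaper one get blended based on how far each pixel sits from the reported gaze point:

```python
import numpy as np

def foveated_blend(render_high, render_low, gaze_xy, width, height, fovea_radius=0.15):
    """Blend a full-quality frame with a cheap one based on gaze (illustrative only).

    render_high / render_low: callables returning (height, width, 3) float images.
    gaze_xy: gaze point from the eye tracker, in normalized 0..1 screen coordinates.
    fovea_radius: size of the full-quality region, as a fraction of screen width.
    """
    high = render_high()   # expensive pass: full shading everywhere
    low = render_low()     # cheap pass: e.g. rendered at low resolution, then upscaled

    # How far is each pixel from the spot the user is looking at?
    ys, xs = np.mgrid[0:height, 0:width]
    dist = np.sqrt((xs / width - gaze_xy[0]) ** 2 + (ys / height - gaze_xy[1]) ** 2)

    # Full quality inside the fovea, fading out to the cheap image in the periphery.
    blend = np.clip(1.0 - (dist - fovea_radius) / fovea_radius, 0.0, 1.0)[..., None]
    return blend * high + (1.0 - blend) * low

if __name__ == "__main__":
    W, H = 640, 360
    frame = foveated_blend(
        render_high=lambda: np.full((H, W, 3), 0.9),  # stand-in for a detailed render
        render_low=lambda: np.full((H, W, 3), 0.5),   # stand-in for a cheap render
        gaze_xy=(0.7, 0.3),                           # eyes pointed at the upper right
        width=W, height=H,
    )
    print(frame.shape)  # (360, 640, 3)
```

In a real headset the savings come from never doing the expensive work on peripheral pixels in the first place, rather than from blending two finished images; the sketch just shows how the gaze point decides which parts of the frame deserve the effort.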

"Not only will it be useful, it will be a real necessity," said David Luebke, who's been studying this technology for nearly two decades. He's now a VP of graphics research at chipmaker Nvidia.

There are still some kinks to work out. If you're looking at light shimmering through a tree or in between white picket fences, it's hard to get the visuals to be convincing even with this cool new tech, Luebke says. That may seem like quibbling, but if the visuals don't come across convincingly, our brains tell us something's wrong. That breaks the sensation of immersion that VR is supposed to be all about.

"The old joke is that we have tiger detectors," he said. "We're very sensitive to motion and flicker on our periphery, and it's super distracting."  

Changing the game

I got my first taste of foveated graphics when Tobii's CEO and his team had me try a VR headset laden with his company's sensors.

Once inside, they showed me a demonstration of how precise eye tracking can be. They loaded me into a grassy yard filled with metal bottles on stumps. Eskilsson asked me to pick up a virtual rock and toss it at a bottle to knock it over. I missed every time.

Then his team turned on the eye tracking, which aimed my throws for me. All I had to do was look at a bottle and throw. Suddenly, I was hitting them so well I thought the demo was rigged. So I went back to working without the tracking -- and failed again.

To show off foveated rendering, they put me on an alien planet next to a box with switches. I read the markings and flipped the switches while we talked. What I didn't know was that about half the screen around me was blurry. Everything in my area of focus, though, was perfectly sharp.

If the demonstrations are any indication, eye-tracking technology promises big strides in helping VR look better and run well on cheaper devices.

"When it works, you just don't notice it," Eskilsson said. Instead, everything in VR will just look better.

Finally.

First published June 13, 2018, 5 a.m. PT.
Update, Feb. 24, 2019, at 9:30 a.m.: Adds details about Magic Leap and HoloLens 2 now using eye tracking.