What is ray tracing, and is it really the future of gaming?
Breaking down one of the hottest technologies impacting gaming, and how it will affect Sony's next console
If you've been in and around the gaming industry over the past couple of years, you'll have heard about ray tracing. There's been a lot of loose talk about how ray tracing represents the future of gaming, and how future hardware will surely need dedicated resources to handle it. But what is ray tracing, and is the massive hype bubble slowly expanding around it justified?
One indication in the affirmative is the announcement by Mark Cerny in a Wired feature that Sony's next console will support ray tracing by way of its custom AMD Navi GPU. He even introduced an interesting wrinkle - the console will be able to leverage ray tracing not just for visual effects, but also to simulate positional audio.
A world of light and shadow
Like any new consumer-oriented technology, ray tracing is partially obfuscated by all the marketing buzzwords emanating from people with a vested interest in selling it. The truth is, ray tracing isn't actually new at all; it's a technique film studios have been using for years in special effects and animation to light scenes and produce proper reflections of digital objects. Remember seeing those exploding alien warships, wreathed in smoke and fire, reflected in Iron Man's helmet in The Avengers? That's ray tracing at work. What's new is the promise of doing this sort of rendering in real time, letting video games achieve on the fly what Hollywood render farms, leaning on hugely expensive hardware, take days or even weeks to accomplish when computing ray tracing for complex scenes with millions of individual rays.
Ray tracing works by simulating rays of light and the ways they interact with objects and surfaces, modeling how lighting affects color or occlusion on a per-ray basis. Previously, lighting was handled during rasterization, the process of translating 3D polygonal models into a 2D image built out of pixels. Lighting effects were largely 'faked', with an engine determining how light sources in a scene would theoretically affect surfaces based on their placement, and shading or coloring the pixels of each surface accordingly. That approach is computationally efficient, so it doesn't require the kind of ridiculously powerful, dedicated hardware that ray tracing has traditionally demanded, but it comes with a number of limitations.
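To make that distinction concrete, here's a minimal sketch (in Python, with invented names, purely for illustration rather than any engine's actual code) of the kind of 'faked' per-pixel lighting a rasterizer applies: classic Lambertian diffuse shading, which only knows about the surface in front of it and the direction of the light.

```python
# A minimal sketch of the 'faked' lighting a rasterizer applies per pixel:
# classic Lambertian (N dot L) diffuse shading. All names are illustrative,
# not taken from any particular engine.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert_shade(surface_color, surface_normal, light_dir, light_color):
    """Shade a pixel from local information only: no reflections, no occlusion
    by off-screen geometry, just the angle between the surface and the light."""
    n = normalize(surface_normal)
    l = normalize(light_dir)
    intensity = max(dot(n, l), 0.0)  # a light facing away contributes nothing
    return tuple(sc * lc * intensity for sc, lc in zip(surface_color, light_color))

# Example: a red surface lit from above and to the right
print(lambert_shade((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 1.0, 0.0), (1.0, 1.0, 1.0)))
```

Nothing in that calculation knows about the rest of the scene, which is exactly why raster-only lighting struggles with reflections and shadows from objects that aren't directly in view.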
With ray tracing, developers can simulate the way light operates in the real world, something raster-based solutions can only approximate. This means that scenes appear more naturally lit and realistic, and it also means that light sources or reflective objects that aren't in frame can still be accurately reflected in the visible scene. The explosion of an offscreen grenade can still be seen reflected in the shiny metal hull of an Abrams tank rolling towards your character, for instance, and the shadows cast by soldiers currently out of frame can be rendered more accurately.
Ray tracing works backwards: it follows a beam of light from an onscreen pixel into the 3D scene and tracks whether, and where, it interacts with objects before reaching a light source. If the ray strikes an object, bounces between multiple objects, or is even refracted by passing through glass or water, that information is folded into the final light and color of the pixel.
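As a rough illustration of that backwards tracing, here's a toy, single-ray sketch (Python again, with an invented one-sphere scene and a deliberately simple shading model): a primary ray is cast from the camera, the nearest hit is found, and a shadow ray is fired toward the light to decide whether the point is lit.

```python
# A toy illustration of backwards ray tracing: one ray per pixel, cast from the
# camera into the scene, then a shadow ray from the hit point toward the light.
# The scene, names, and shading model are invented for the example.
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, sphere):
    """Return the distance to the nearest intersection, or None if the ray misses."""
    center, radius, _color = sphere
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, spheres, light_pos):
    """Follow one primary ray and return a pixel color."""
    hits = [(hit_sphere(origin, direction, s), s) for s in spheres]
    hits = [(t, s) for t, s in hits if t is not None]
    if not hits:
        return (0.0, 0.0, 0.0)                        # ray escapes: background
    t, sphere = min(hits, key=lambda h: h[0])
    point = tuple(o + t * d for o, d in zip(origin, direction))
    to_light = norm(sub(light_pos, point))
    # Shadow ray: if anything blocks the path to the light, the point is in shadow.
    if any(hit_sphere(point, to_light, s) for s in spheres):
        return (0.0, 0.0, 0.0)
    normal = norm(sub(point, sphere[0]))
    brightness = max(dot(normal, to_light), 0.0)
    return tuple(brightness * c for c in sphere[2])

# One pixel's worth of work: camera at the origin, ray pointing down -z toward a red sphere.
scene = [((0.0, 0.0, -3.0), 1.0, (1.0, 0.2, 0.2))]
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), scene, (2.0, 2.0, 0.0)))
```

A real game multiplies this work by millions of pixels, multiple bounces, and dozens of frames per second, which is why dedicated hardware acceleration matters so much.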
It's also possible to utilize ray tracing in sound design, as Mark Cerny suggests, particularly if you're looking for a faster, cleaner solution than more traditional methods provide. If you treat sound waves as much smaller rays, you can model them much the way ray tracing models light, tracing them from the source to the listener and judging where they interact with objects in the environment. The difficulty is that sound waves are generally much larger than light waves, with wavelengths reaching ten meters or more while light's are measured in nanometers, so modeling them as rays will inevitably introduce inaccuracies. It is certainly possible, however, and would be computationally more efficient than most alternative solutions.
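Purely as a thought experiment, and not as a description of Sony's actual implementation, here's how a single 'audio ray' might be approximated: the straight path from a sound source to the listener determines delay and loudness, and anything blocking that path muffles the result.

```python
# A hand-wavy sketch of ray-style audio, assuming (as a simplification, not as
# Sony's actual technique) that a sound can be treated as a straight-line ray
# from source to listener: distance sets delay and loudness, blockers muffle it.
import math

SPEED_OF_SOUND = 343.0  # meters per second, in air

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def segment_hits_sphere(start, end, center, radius):
    """Rough occlusion test: does the straight path pass through a spherical blocker?"""
    d = tuple(e - s for s, e in zip(start, end))
    length_sq = sum(c * c for c in d)
    if length_sq == 0:
        return distance(start, center) < radius
    t = sum((c - s) * dc for s, c, dc in zip(start, center, d)) / length_sq
    t = max(0.0, min(1.0, t))
    closest = tuple(s + t * dc for s, dc in zip(start, d))
    return distance(closest, center) < radius

def audio_ray(source, listener, blockers, loudness=1.0):
    """Return (arrival delay in seconds, perceived loudness) for one direct 'ray'."""
    dist = distance(source, listener)
    delay = dist / SPEED_OF_SOUND
    gain = loudness / max(dist, 1.0) ** 2   # simple inverse-square falloff
    if any(segment_hits_sphere(source, listener, c, r) for c, r in blockers):
        gain *= 0.2                          # crude 'muffled behind an obstacle'
    return delay, gain

# A grenade 20 metres away, with a pillar between it and the player
print(audio_ray((20.0, 0.0, 0.0), (0.0, 0.0, 0.0), [((10.0, 0.0, 0.0), 1.0)]))
```

The inaccuracy the article mentions shows up here: because sound wavelengths are huge, real sound diffracts around that pillar rather than being cleanly blocked, which a simple ray test can't capture.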
The hardware narrative
It's likely that you first heard of ray tracing (despite having seen it on screen in films for years) when Nvidia started touting the ray tracing capabilities of its RTX 20-series cards. Nvidia made a tremendous amount of noise about how its RT cores would enable the next generation of GPUs to bring incredible real-time ray tracing to video games for the first time. Largely it was a ploy to justify the extremely high prices of the new cards, but it wasn't all marketing hype - it did represent a pretty incredible technical achievement, allowing modern gaming PCs to do in real time what took those Hollywood studios several orders of magnitude longer.
Reception of those 20-series cards has been mixed and sales have been tepid, but perhaps more importantly, Nvidia's dominance of the ‘ray tracing in games’ narrative has begun to slip. There have been a number of stumbling blocks, the most important being how few games currently support ray tracing and how, even in the titles that do, it doesn't make a glaring, immediately noticeable impact on graphics and presentation. This doesn't come as a huge surprise, of course - most new graphics technologies, like the recent HDR renaissance, take some time to be properly rolled out and implemented - but it does look as though Nvidia got a bit too far ahead of the curve, and it has begun losing its ray tracing preeminence in the interim.
First there was the news that RTX cards weren't necessary for ray tracing, as demonstrated by a CryEngine demo. Then Nvidia itself announced it was bulking out the GTX line with Turing-based cards that lack dedicated ray tracing hardware, and finally that it was bringing ray tracing support to GTX cards through a driver update. And now rumours are widely circulating that AMD will soon begin rolling out its own cards with dedicated ray tracing support that could match or exceed the RTX line in terms of performance. With AMD reportedly working with both Sony and Microsoft on the next generation of consoles (likely providing custom versions of its Navi architecture), word that AMD chips will soon pack dedicated ray tracing hardware means the next generation of consoles will likely also be jumping on the RT bandwagon.
Lighting the way forward
All of this isn't great news for Nvidia, at least in the short term, but it is great news for ray tracing enthusiasts. Broader hardware support means that doing the work to build ray tracing tech into games will look much more appealing to developers, because there will be an audience able to appreciate the results. And even for Nvidia, as ray tracing becomes more ubiquitous, sales of its RTX hardware should grow with it, especially if the company is able to bring prices down to accelerate mainstream adoption.
It's also good news for gamers at large. Ray tracing may not be making huge waves in a practical sense now, in large part because current support feels a bit rushed or tacked on, but as we see games constructed from the jump with ray tracing support in mind, the final products will start looking a lot more impressive. In countless demos, first from Nvidia and now from CryEngine and Unity (the games engine that recently incorporated ray tracing tools), we've seen the potential of ray tracing and, properly implemented, it's as stunning as the marketing would have you believe.
The takeaway is that ray tracing is more HDR than stereoscopic 3D: it's not a gimmicky, flash-in-the-pan tech that will fail to gain a foothold and exit the conversation within a year. It really is an important part of the future of games, of ensuring that the next generation of games looks closer to reality than ever before, and being able to deliver it in real time really is a stunning innovation. It's an inevitability, and the main question around ray tracing is less ‘if’ than ‘when’.
Alan Bradley is a former hardware writer for GamesRadar and PC Gamer, specialising in PC hardware, and now works as a freelance journalist with bylines at Rolling Stone, Gamasutra, Variety, and more.