Modern Warfare 2 is playing with your mind
How a new rendering technique is changing the way we see the world in Call of Duty
Life moves pretty fast in a Call of Duty multiplayer match. One second you're running toward a doorway, M4 in hand, ready for anything; the next, you're watching a replay of someone shooting you through a window from 200 yards away. You may, however, have noticed a subtle improvement in your ability to spot this sort of danger between the launch of the Modern Warfare 2 beta last month and the arrival of the full game. Sadly, it might not be entirely down to your improvement as a player. You thought you played the beta to test the netcode and server performance, right? No, not quite. There was something else going on.
At a recent press event in Amsterdam, Michal Drobot, the studio head at Infinity Ward Poland, introduced a technology the studio has been slowly evolving over the past three years. It's called 'software-based variable rate shading', and it's quietly changing the way players perceive the environment – and nearby enemies – within the game.
"We developed the tech a few years ago and have been improving it through this generation of consoles," explains Drobot. "It allows us to put extra resolution and extra performance [on screen], but only where we know it counts. We want to put extra quality where you are aiming, and we want to put extra quality of resolution where points of interest are on the map. If there is an enemy on screen, we make sure that enemy has the highest quality resolution so you're never feeling like you're being hindered by the game rendering at a lower res."
Fog of war
Importantly, this is a way of improving visual clarity without requiring, say, an expensive PC graphics card (the tech is entirely software-based, hence the name), and as it's so focused on specific elements of the game environment, the refresh rate doesn't take a hit either. So when you're playing Modern Warfare 2, the engine is quietly adding resolution detail to key elements that players actively look for, especially enemies and important areas or scenic features.
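To make the idea concrete, here's a minimal sketch of how a software-based variable rate shading pass might classify screen tiles: full shading quality around the crosshair and around projected enemies or points of interest, with the periphery dropped to a cheaper rate. The names, thresholds and three-tier rate scheme are illustrative assumptions for this article, not Infinity Ward's actual implementation.

```cpp
// Minimal sketch of software variable rate shading (VRS) tile classification.
// Hypothetical names and thresholds for illustration only; this is not
// Infinity Ward's code, just the general idea described above.
#include <cmath>
#include <vector>

enum class ShadingRate { Full, Half, Quarter };   // shading quality tiers

struct Vec2 { float x, y; };

struct PointOfInterest {
    Vec2  screenPos;   // projected screen position (e.g. an enemy)
    float radius;      // screen-space radius that should stay sharp
};

static float Distance(Vec2 a, Vec2 b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y));
}

// Pick a shading rate for one screen tile: full rate near the crosshair and
// near points of interest, reduced rate everywhere else, so the frame budget
// is spent where the player is actually looking.
ShadingRate ClassifyTile(Vec2 tileCenter,
                         Vec2 aimPoint,
                         const std::vector<PointOfInterest>& pois,
                         float fovealRadius)
{
    if (Distance(tileCenter, aimPoint) < fovealRadius)
        return ShadingRate::Full;

    for (const PointOfInterest& poi : pois)
        if (Distance(tileCenter, poi.screenPos) < poi.radius)
            return ShadingRate::Full;            // enemies always stay sharp

    // Periphery: half rate at moderate distance, quarter rate far away.
    return Distance(tileCenter, aimPoint) < 3.0f * fovealRadius
               ? ShadingRate::Half
               : ShadingRate::Quarter;
}
```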
But how did the team know what players would be looking at in each map? Partly because the maps are designed to allow and encourage certain play behaviours: there are always vantage points with long sight lines, there are always open areas for big gun fights, and there are always tunnels or alleys for quick getaways – these are key points of interest as players navigate. But there's also something else: Infinity Ward watched people play. They watched a lot.
"We know through QA playtests and additional work at the studio where people look, we track that, so we have all this data," says Drobot, "We use machine learning over many playthroughs of the game; we do hundreds of playthroughs over the course of a year, we're talking tens or even hundreds of thousands of hours. So we know which areas are going to be more interesting to players – and that way we can adjust the texture data, both in terms of what everyone in the match sees and also locally on your client."
Watching QA testers in the studio is useful, but it can't possibly replicate the behaviour of real-world players. That's where the beta came in. The beta marked the first time software-based variable rate shading was used specifically to highlight enemies and points of interest. "During the beta we actually tweaked this between the weekends because we wanted to see how player accuracy and behaviour changed," says Drobot. "This was useful data we could analyse, and it helped us guide the players, which is really important."
Vitally, this was never mentioned in the run-up to the beta or through any patch notes or community messaging. "When you're trying to do research on player groups you don't want to tell them what's going on," says Drobot. "You want to see the behaviour change for real. There's a lot of psychology and social engineering going into this!"
Eye, spy
Indeed, what the tech is actually doing is mimicking human vision. Our sight is not as detailed at the periphery, but our visual system emphasises certain elements, such as colour, shape and fast movement, so that we have warning of potential threats. Drobot explains that the visuals in Call of Duty involve three perceptual layers: the photorealism of an environment rendered in as much detail as possible; over that, a layer of cinematic aesthetics (dramatic lighting, reflections, lens flare); and finally a human perceptual layer that attempts to simulate what we'd actually see as human beings in these spaces.
In Modern Warfare 2, you're not just seeing what's actually there; the engineers are trying to replicate the way our brains process visual input. "The game knows where the players are, but we don't always know how individuals are seeing each other when they play," says Drobot. "This is because we're humans, not cameras: we have a visual system, we see things differently... so we spent a lot of time [during development] attempting to mimic what a person would see while playing the game, using different environments, different lighting effects and different character skins, and then we tried to figure out – is it actually possible for you to see a particular enemy on screen? It's a fuzzy question, but we will help you out a bit – enemies will stick out more. It's a tough call. We don't want the game to look too cartoony, but it helps, it evens out the playfield. We want a character fully dressed in white to be as visible in troublesome areas as a character in dark clothing in the same spot. But we also want to make sure it looks good."
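One way to picture that "can you actually see this enemy?" question is as a contrast check between a character and the background behind them, with a small corrective nudge when they would otherwise blend in. The sketch below does exactly that; the luminance weights, contrast formula and boost values are assumptions chosen for illustration, not the studio's perceptual model.

```cpp
// Sketch of a perceptual visibility check: if a character would blend into
// the scene (white gear against snow, dark gear in shadow), nudge their
// shading so they stick out just enough. Illustrative values only.
#include <algorithm>
#include <cmath>

struct Color { float r, g, b; };

// Approximate perceived luminance (Rec. 709 weights).
static float Luminance(const Color& c) {
    return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
}

// Weber-style contrast between a character and the background behind them.
static float Contrast(const Color& character, const Color& background) {
    const float lc = Luminance(character);
    const float lb = Luminance(background);
    return std::fabs(lc - lb) / std::max(lb, 1e-3f);
}

// Apply a small lift or drop when contrast falls below a threshold, so the
// character reads against the scene without looking cartoony.
Color EnsureMinimumVisibility(Color character, Color background, float minContrast) {
    if (Contrast(character, background) >= minContrast)
        return character;                       // already readable, leave it alone

    const float lb   = Luminance(background);
    const float lift = (lb > 0.5f) ? -0.15f : 0.15f;  // darken on bright scenes, brighten on dark
    character.r = std::clamp(character.r + lift, 0.0f, 1.0f);
    character.g = std::clamp(character.g + lift, 0.0f, 1.0f);
    character.b = std::clamp(character.b + lift, 0.0f, 1.0f);
    return character;
}
```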
Drobot says the engineering team will keep tweaking the tech, reacting to how players behave in both Modern Warfare 2 and Warzone 2. He concedes that, like any element of game design, there's no correct way of doing things, no perfectly balanced approach. But it's fascinating how developers are now moving beyond a purely photorealistic approach to game visuals and environmental design, and toward an area of artificial intelligence research known as player behavioural modelling, in which player actions and psychological motivations are closely observed by the game engine. "There are a lot of compromises between the teams to arrive at something that frankly is never going to be perfect," says Drobot. But then that's kind of the point, because neither is our eyesight, nor our way of seeing the world.
The next time you get a five-kill streak on Farm 18, then, you'll have to think about how much of that was down to skill, and how much to the subliminal messaging of a technology designed to watch and mimic human perception.
Keith Stuart is an experienced journalist and editor. While Keith's byline can often be found here at 12DOVE, where he writes about video games and the business that surrounds them, you'll most often find his words on how gaming intersects with technology and digital culture over at The Guardian. He's also the author of best-selling and critically acclaimed books, such as 'A Boy Made of Blocks', 'Days of Wonder', and 'The Frequency of Us'.
"Cut to about 3 o'clock in the morning, I end up on the piano, the bars kicked us out, and I spent the weekend nursing a hangover": Troy Baker on the 9-hour bender that taught him how to bring Indiana Jones to life
Stalker 2 first post-launch patch is here and includes over 650 fixes for everything from quest blockers to crashes, but GSC Game World isn't done yet