Article | Taking a cinematic color grading approach to video games

Unofficially color grading Uncharted 4: A Thief’s End

I love the ‘Drakes’ series of games. They’re expertly crafted, entertaining and immersive examples of what’s possible at the intersection of cinema and interactivity.

When the latest trailer for Uncharted 4 came out recently, I was impressed with the gameplay and graphics, but something was off… The color grading was perplexing: blacks weren’t black, whites weren’t white, and I doubt the talented people at Naughty Dog were going for an Instagram filter effect, so I threw the trailer up in my color grading suite to have a closer look.

Update later on Dec 16

This article got picked up by Neogaf and Kotaku. Cool, thanks!

Update Dec 16

It seems that I’m not the only one noticing this. It looks like the video might have been exported compressed to the 16-235 video range… The talented cats at Gamersyde have updated the video with improved blacks. Naughty Dog should release another one with the proper range, as the one on YouTube is really not doing their fantastic game justice. That said, the color grading of games is an interesting topic, with some film techniques listed below.
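For the curious, the limited-range problem is easy to sketch: 8-bit ‘video levels’ reserve 16 for black and 235 for white, so if a full-range display takes those numbers literally, black sits at 16 and looks milky. Here’s a minimal Python illustration of the standard expansion back to full range (purely illustrative, not anyone’s actual pipeline):

```python
def limited_to_full(v):
    """Expand an 8-bit 'video levels' value (16-235) to full range (0-255).

    Standard limited-range expansion: 16 maps to 0, 235 maps to 255,
    and anything outside that range is clipped.
    """
    return max(0, min(255, round((v - 16) * 255 / (235 - 16))))
```

If a full-range display skips this step, a ‘16’ black shows up as dark grey, which is exactly the milky-blacks look in the original trailer.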


  • It needs to be said that this whole article is a bit of a blend between technical tricks to improve visual realism and some subjective art tweaking. I only shifted the color balance in the example near the bottom where Drake is in the shade (where I say I maybe went a bit too far); the rest was just level balancing. It’s subjective stuff!
  • Also I’ve read there’s an uncompressed version of this movie, which would have been way better to use for my examples, but I didn’t know it existed at the time I messed around with it.
  • I’m no HDR freak, and this isn’t HDR. It’s about using the available range of LDR.
  • People have asked what color grading software was used, it’s the amazing Davinci Resolve.

I’ve been a CG Sup working on lighting and cameras in video games for over 15 years. I also work as a film DP and colorist, and it’s interesting, though not surprising, how much more sophisticated the color tools in the film world are than those in the videogame world. Game engines like Frostbite, Unreal and even Unity have LUTs (Look Up Tables), gamma curves and basic color correction tools, but the workflows and information displays have nothing on the tools used in film production.

I believe there’s a market to bring film color grading tools and techniques to video games. The current game engines are so powerful and the artists so incredibly talented that photorealism is now essentially a cost of entry for AAA blockbuster titles. However, the game tools and techniques for color management are still comparatively primitive. Usually it’s the Art Director, Lighter and CG Sup all working together to herd all the art and rendering gizmos into creating a balanced frame. There’s no ‘Colorist’ credit in games, but it’s really only a matter of time before that changes.

Let’s take a look at the latest Drakes. Overall, there are some interesting choices in regard to dynamic range. The characters, models and lighting methods are absolutely fantastic, but the final frame’s levels are holding them back.

Look at how amazing the character is! That said, the range on this frame needs work and the skin saturation is too high. Oversaturated skin values are a really common issue in a lot of video games. If you run your game and your reference images through some good color scopes, it becomes really obvious where your game’s saturation values diverge from reality.

I massaged the values to fully use the dynamic range, and reduced the skin tone saturation. To me, he’s starting to look more photorealistic. The frame is more believable and punchy.
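The desaturation step can be sketched in a few lines: drop into HSV, scale saturation down, convert back. This is a simplified illustration using Python’s stdlib colorsys, not the actual Resolve operation, which works on full footage with far finer controls (hue-qualified keys, curves, and so on):

```python
import colorsys

def desaturate(rgb, amount=0.25):
    """Reduce the saturation of an RGB triple (floats in 0-1) by `amount`.

    Converts to HSV, scales S down, and converts back; hue and
    brightness (V) are left untouched.
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, s * (1.0 - amount), v)

# An oversaturated skin tone (hypothetical values) pulled back 25%:
toned_down = desaturate((0.85, 0.55, 0.45), 0.25)
```

Because only S is scaled, the brightest channel keeps its value, so the overall exposure of the skin doesn’t shift while the color calms down.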

Here are some dark scenes that never get truly dark; the blacks are milky.

Here’s the same frame with a little massaging to have the image use the entire dynamic range of the output.

The color grading massaged the values so a greater dynamic range of signal is used.


I got this video from YouTube, with all its glorious video compression included. This isn’t an exact science: the compression will shift the blacks and degrade things. Had I had raw video, the results would be different, but not so different as to invalidate these results or ideas. The game looks better than it does on YouTube; everything looks better before being compressed heavily.

The waveforms tell the story in great detail. Here’s how to read them: 0 is black and 1023 is pure white. The waveforms display the brightness and color of pixels from the left of the screen to the right. See the two bright waterfall sections in the left third of the picture? Those are the two spikes on the left third of the waveforms.
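If you want to build intuition for that left-to-right mapping, here’s a toy waveform reader in Python: for each column of a luma frame (values 0 to 1023, matching the scale above) it records the span of brightness values that column’s trace would occupy on the scope. Purely illustrative, a real waveform monitor plots every pixel, not just the envelope:

```python
def waveform_envelope(frame):
    """Per-column (min, max) luma for a 2-D frame of 10-bit values.

    A waveform monitor plots each pixel's brightness against its
    horizontal screen position; this returns the vertical span each
    column's trace would cover, where 0 = black and 1023 = pure white.
    `frame` is a list of rows, so zip(*frame) walks the columns.
    """
    return [(min(col), max(col)) for col in zip(*frame)]
```

A frame whose envelope never touches 0 or 1023 anywhere is exactly the "none of the values even get near black or white" situation visible in the original waveform.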

The original frame is the left waveform and the color graded one is on the right. See how none of the values even get near black or white on the original?

After color grading, the blacks now hit black and the brightest areas of the waterfall are now getting near white with no crushing or clipping.

In a way, this is like doing an Ansel Adams ‘Zone System’ exposure on the footage. Why do this? The reason you need to ‘fill the waveform’ is that TVs, monitors and cameras only capture and display a small range of the actual depth of information that’s out there. Throwing away even more of this information by not hitting the limits just compounds the problem. It’s like listening to the world through 8-track cassette ears vs 24-bit uncompressed audio. Photocopying photocopies. If we can only display a limited range compared to what we’re trying to emulate, at least spend it all!
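In its simplest form, ‘filling the waveform’ is just a linear levels stretch: map the darkest value in the frame to 0 and the brightest to the display maximum. A grading suite does far more than this (per-channel curves, lift/gamma/gain), but the core idea can be sketched in Python:

```python
def fill_range(pixels, lo_out=0, hi_out=255):
    """Linearly stretch values so the darkest pixel hits lo_out
    and the brightest hits hi_out.

    Because the mapping is monotonic and the endpoints land exactly
    on the limits, nothing gets crushed or clipped. Assumes the
    frame isn't a single flat value (min != max).
    """
    lo_in, hi_in = min(pixels), max(pixels)
    scale = (hi_out - lo_out) / (hi_in - lo_in)
    return [round(lo_out + (p - lo_in) * scale) for p in pixels]
```

Run a milky frame through this and the blacks hit black, the highlights reach white, and every value in between keeps its relative order, which is the ‘no crushing or clipping’ property mentioned above.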

The brightness dynamic range outside on a sunny day is around 1,000,000,000:1. We’re able to see approx 15,000:1. Your monitor is around 700:1 to 1000:1. So that original waveform there on the left sure is chucking away a lot of useful information! It’s using just over half of the dynamic range of an 8-bit signal, not at all how it would look if you were really there, or even if you took a picture of that scene with your phone camera.

Photorealism is about mimicking reality. Reality has a massive dynamic range. If you’re going for photorealism, you need to spend all of the limited 9.5 stops of dynamic range that the monitor you’re looking at can do, otherwise it’s going to look washed out, like old film or a heavy Instagram effect. Spend the signal to the limits.

Here’s another frame with some hot sun hitting the rocks as well as deep shadow values on the left. In real life, the dynamic range would be in the hundreds of thousands to one. Your DSLR camera shooting RAW would capture 20,000:1, or around 14 stops of dynamic range. You could tone map / color correct it into the display space of 700:1, or about 9.5 stops of dynamic range, for a nice punchy image which looks photorealistic, obviously.
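The stops arithmetic here is just log base 2 of the contrast ratio, since each photographic stop is a doubling of light. A quick sanity check in Python:

```python
import math

def contrast_to_stops(ratio):
    """Convert a contrast ratio (e.g. 700 for 700:1) to stops.

    One stop is a doubling of light, so stops = log2(ratio).
    """
    return math.log2(ratio)

# 20,000:1 (a RAW DSLR capture) is ~14.3 stops;
# 700:1 (a typical monitor) is ~9.5 stops.
```

The same function also shows why the sunny-day figure above is so punishing: a 1,000,000,000:1 scene is roughly 30 stops, more than triple what the display can show.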

In this frame, just a little over half of the dynamic range in a standard video signal is used. There’s bright sun reflections and dark shadows, so it should be higher than that. The highlights aren’t bright, the shadows aren’t dark.

After a little massaging, the very darkest of the shadows now hit black and the hottest highlights are just about at white. It’s like a grey film has been wiped away.

The original waveform on the left shows that the blacks never get black and that hot rock getting hit full blast from the sun doesn’t carry that bright of a signal.

The right waveform shows the highlight and shadows values being set to their maximum range possible for a standard display.

Cinema cameras, like the new RED Dragon, have the ability to capture up to 16.5 stops of dynamic range. DSLR cameras are pushing 14.5. Negative film is around 13 stops. We want dynamic range! So many videogames throw it away though, with flat blacks and wimpy whites. Fill your waveforms if you want it to look real. HDR displays are coming, which will only exacerbate this issue if it’s not handled with the right tools and experience.

Here’s a shadowy shot with a little blast of sunlight at the top left. In addition to increasing the dynamic range, I feel the shadows are a little too warm in hue and that pulling them a little colder would help the overall presentation and balance with the sunlight.

There’s more depth and clarity. Maybe I went a tad too far with the cool values in the shadows, and it’s perhaps just a bit too contrasty now as well. Going too far and then dialing it back a bit can be a good technique, as long as you remember to pull it back!

I must say I don’t want to come across as critical of the talented Uncharted art team. I love the stuff you guys make, your games are absolutely amazing, and I know this game isn’t out for a while yet, so I’m not trying to be negative whatsoever. This post is about exploring how film-grade color grading techniques and tools can apply to games. I believe there’s room for improvement in game engine color tools, and in bringing film techniques and the role of ‘Colorist’ to video game production.

Here are a couple of other games put through some color grading work. First, an older Gears of War, which wasn’t hitting blacks or whites and had a bit of an overall blue/purple tint, which I removed in the second image.

The new Halo is looking amazing. This wasn’t from video; it’s a released still image, so who knows what work has been done on it… It could very well be from a render-quality-cranked-up marketing-screenshot version of the game that runs at single-digit framerates, but regardless, it looks great. That said, there wasn’t much room for color massage. You’ll see in the right image how the dark values sit a bit tighter, and I feel it looks a bit more 3D with the blacks coming down to zero, but that’s about it.

Update Dec 17

Someone emailed me to say that the Halo image isn’t from the game but a pre-render from Axis Animation, which has clearly already been through a color pass and didn’t need much of anything.

Check your games on a color suite and you might be surprised how a few little tweaks can make a massive difference to the overall presentation.