Article | Lighting: From concept to final frame

I illustrate the lighting and colour post-production process used on Homeworld: Deserts of Kharak

Disclaimer

Below are images from the game which include locations from the later missions of the Homeworld: Deserts of Kharak campaign.


Concept Art

We’re lucky to have such a strong concept art team, and because of that almost everything starts there. Rob, Aaron, Brennan and Cody painted so many beautiful images of landscapes being crawled over by giant vehicles, at all times of day and night and in all these wonderful variations.

The BBI archives have hundreds of images spanning years of development. It’s quite a treat to wander through all the amazing art and pull out images which nail a particular mood and time of day.


Colour Boards

We assembled a collection of images which stood out as the best candidates to build our levels around, taking into consideration the time of day, the colour palettes and how they would all work together. We also added photographs of dunes at various times of day so we could see how the real thing responded under all sorts of different lighting conditions.

Below is our concept colour board for the campaign, missions M01-M13. However, this is really just the beginning of the process; it’s quite a challenge to actually make the game look as good as the concept art!

Some changes and adjustments would have to be made to best suit the capabilities of the game engine, but it’s important to start things with a colour guide as it will inform so many decisions when creating the level assets.


In-Engine First Pass

There is a lot of in-engine art, along with textures, shaders and render settings, all separate but all needing to be dialed in with love and precision to make the game look anything like our target concept art. None of the systems know anything about one another, so their relationships and intermixing have to be tuned entirely by eye. Often we’d get a few things working well together, but one system just wouldn’t click and we’d have to come at the problem from a different angle.

Occasionally we would hit a dead end with a specific component and have to write a new system to get the desired results. Our fog system initially worked out quite well, but it started to fall behind once we got all the other systems up and tuned, forcing us to design and build a new one.

Many people will probably look at a level and see a nice sunny sunset without realizing that it’s created with all these different systems which don’t have any connection to each other. Each control is like an instrument requiring tuning so that they’re all playing the same song.

The first step was to try and get the game as close to the concept art as possible using the in-engine controls. This required level-specific assets such as ground texture maps, the skybox, the fog, real-time lighting and baked lighting. There are a few more rendering controls, but they’re kept at ‘zero’ for now and only used near the end of the process, as explained later.

A fantastic aspect of the Unity engine is the Asset Store community and all the goodies and gizmos available there. Often we would experiment with an idea and find an asset on the store which could help with the execution. ‘Expensive’ ideas could be prototyped and explored very cheaply by grabbing something already built. There was no need to code an entire lens bloom system to test out a particular look when you can just get one off the Asset Store for $10. If the prototype worked, we would often then write our own system to best integrate with our game, but what an amazingly broad and deep ecosystem to have available.

Asset Store components we kept and used in the final game were:

Here is a look at one of our lighting panels for M11. We used the PBR rendering in Unity 5.2 as our foundation and wrote many of the other visual components ourselves, like fog, particles, the terrain shader, lighting controls, etc., in addition to the above Asset Store items.

Here’s our concept art target for M13:

This was our first crack at lighting it in-game. Quite lacking in spirit compared to our target, isn’t it? We have our work cut out for us.


The Colour Grading Iteration Loop

Put the game image beside the concept art and you can easily see a number of things which need to be done right away. It’s really clear that the skybox, fog and shadows all need a lot of tuning.

We added some analysis data to help our eyeballs out with extracting and identifying the delta between our game and the visual targets. We used DaVinci Resolve for all our colour grading, and as our editing program for some of our release videos.

This is the waveform analysis of the above image to see exactly what the differences between the game and concept images are. You can clearly see the split down the middle with the game on the left and the concept art on the right.

Look how the concept art has so much more tonal variance: the fatter red, green and blue areas on the right half mean more subtle differences in colour and intensity. Our game was close, the red, green and blue bars somewhat line up, but our image was a little thin.

Waveforms are read like this: dots near the top of the waveform represent bright areas, and dots near the bottom are dark. The waveform monitor corresponds directly to the image from left to right, so areas on the left of the image are represented on the left of the waveform, and so on.

See the six dark vehicles at the bottom right of the image above? They are represented on the waveform by those lines dropping almost to zero at its bottom right. Each dot is placed horizontally based on its pixel’s screen position, then vertically based on that pixel’s luminance, for each of the red, green and blue channels.
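If you want to poke at this yourself, a waveform like the ones above can be approximated with a few lines of scripting. This is a minimal sketch, not part of our pipeline; it assumes a Python environment with NumPy, Pillow and matplotlib, and the file name is just a placeholder.

```python
# Minimal RGB waveform sketch: x = a pixel's horizontal screen position,
# y = that pixel's channel intensity, plotted for each colour channel.
import numpy as np
from PIL import Image
import matplotlib.pyplot as plt

img = np.asarray(Image.open("frame.png").convert("RGB"))  # placeholder file name
h, w, _ = img.shape

fig, ax = plt.subplots(figsize=(8, 4))
for ch, colour in enumerate(["red", "green", "blue"]):
    xs = np.tile(np.arange(w), h)        # column index of every pixel
    ys = img[:, :, ch].reshape(-1)       # 0-255 intensity of that channel
    ax.scatter(xs, ys, s=0.05, c=colour, alpha=0.05)

ax.set_xlim(0, w)
ax.set_ylim(0, 255)
ax.set_xlabel("horizontal position in image")
ax.set_ylabel("channel intensity")
plt.show()
```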


Variations

While in a colour grading environment, we would experiment with slight variations on the overall theme to see if anything promising was lurking in the neighborhood.

The iteration loop for messing around with look variants is typically much shorter in a colour grading program than it is in-game. Also, by grading the game renders towards the concept art, you very specifically define what has to be done in-engine to get the visuals aligned. For each rendering component we could derive a clear plan of change to bring it closer to our visual target.


Reference!

We looked at a lot of images of dunes, desert and sand. There’s always so much to learn about the behavior of surfaces under different lighting conditions. We’re using relatively simple in-game lighting and rendering techniques which try to emulate the beautiful and massively complex systems in real life. Since the typical gamer PC can’t compute every photon in a 20 by 20 kilometer world, we would have to computationally cut some corners!

It’s all about the end result, and you must cheat, steal and fake your way towards approximating the sophistication of what actually happens in the real world – in an unlimited dynamic range – with a system which can render 30+fps to a screen with a very limited dynamic range. Trying to make the latter look anything like as good as the former is a massive exercise in doing the most with the least. Evaluating and understanding reference images will show you which key aspects are the most important to emulate.

Despite all the technology, it always boils down to your eyes and an understanding of colour and composition fundamentals. You can do so much with so little if you know the right notes to hit.

Our initial attempts at sand dunes weren’t convincing. Initial attempts at basically anything rarely are. We had to iterate on our terrain shader, the textures and the lighting many, many times to get the shadow, terminator, diffuse and specular components looking right. Go for those areas first and get them in the ballpark; if you can’t get those key components close to your reference, there’s not much point moving forward with any other kind of visual icing. Fancy lens effects won’t save the day if your shadows, mid-tones and highlights aren’t working.
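As a toy illustration of one of those components, here is roughly how a ‘wrap’ term can soften the hard light/shadow boundary of plain Lambert shading. This is not our terrain shader, just a hedged Python sketch of the general idea, with an arbitrary wrap value.

```python
# Toy comparison of plain Lambert lighting vs. a "wrapped" variant whose
# softer terminator reads closer to dunes lit by a big bright sky.
import numpy as np

def lambert(n_dot_l):
    return np.clip(n_dot_l, 0.0, 1.0)

def wrapped_lambert(n_dot_l, wrap=0.3):
    # Pushes the light/shadow boundary past 90 degrees instead of cutting
    # straight to black, softening the terminator.
    return np.clip((n_dot_l + wrap) / (1.0 + wrap), 0.0, 1.0)

angles = np.linspace(0.0, np.pi, 7)      # angle between surface normal and sun
n_dot_l = np.cos(angles)
for deg, plain, soft in zip(np.degrees(angles), lambert(n_dot_l), wrapped_lambert(n_dot_l)):
    print(f"{deg:5.1f} deg   lambert={plain:.2f}   wrapped={soft:.2f}")
```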

Here’s a screengrab of the game from October 2015. You can see the rampant saturation issue, the lack of bounce lighting, the non-existent haze or specular highlighting… A large list of things to address.

I cannot stress how important reference images are, not just to gather some up and perhaps print them out to put on the wall, but to really look at them and deconstruct what’s going on. Study the colour in the shadows, the behavior of the termination line, the saturation roll off of the specular highlights. Understanding these things is crucial to being able to emulate them.

We would put our reference onto the scopes in DaVinci to see exactly what was going on. The relationship between luminance and saturation was something we identified as a key component of sand dune believability. We had problems with highlights being too saturated, which gave a synthetic, gamey render look to everything; it’s a pretty common render engine issue, with light and texture saturation multiplying into ridiculousness. Our final level colour maps had 50% of the saturation intensity they started out with.
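To make the idea concrete, here is a hedged sketch of that kind of highlight desaturation done offline on a screenshot. It is not our in-engine code; it assumes NumPy and Pillow, and the file names, luminance threshold and roll-off strength are all just illustrative.

```python
# Desaturate highlights: keep full saturation in the shadows and roll it off
# as luminance rises, so bright sand stops looking synthetically colourful.
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("dunes.png").convert("RGB")).astype(np.float32) / 255.0

# Rec.709 luminance per pixel
luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

grey = luma[..., None]            # the fully desaturated version of each pixel
chroma = rgb - grey               # how far each pixel sits from grey

# Above ~60% luminance, fade saturation down to half strength (arbitrary numbers).
rolloff = 1.0 - 0.5 * np.clip((luma - 0.6) / 0.4, 0.0, 1.0)
graded = grey + chroma * rolloff[..., None]

Image.fromarray((np.clip(graded, 0.0, 1.0) * 255).astype(np.uint8)).save("dunes_desat.png")
```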

So this light->grade->analyze->repeat loop would be iterated over and over for each level. With every new version things would get closer to the reference and new problems would arise. Once-working skyboxes would need repainting. Fog and atmospheric haze were continually being tuned, almost every day for a few weeks near the end of the project.


Fine Tuning

This level was starting to look fairly good, but there was just something not quite right about the shadows. They didn’t seem to be capturing the scattered bounce light of the sky, and the image was overall a little bit monochromatic.

We were all too aware of the fact that our entire game took place on a dusty planet, so we were really sensitive to the ‘everything is brown world’ issue. It was very important to us that the viewer didn’t get bored of the levels or the visual pacing, especially because the game plays out on a planet which is mostly desert.

When things were not quite clicking even though they were close to the targets, we would put our game renders next to similar reference images which had subtle variants in the colour palette.

This reference image had numerous similarities to our level but it also had this really beautiful dusty blue/purple thing going on in the shadows.

Tuning the game to match the reference gave us this new, far stronger image. It’s in the spirit of before but much more sophisticated and interesting.

Looking at the waveforms – our game on the left, reference on the right – one can see that the highlights and mid-tones could be made quite similar (they need brightening), but the shadows need work in the colour department.

Now look at the waveforms: they’re quite close. With a bit of practice you can colour grade images to match without even looking at the images themselves, only at the waveform monitor.

The final image is much stronger, and now we know exactly what to do in-engine to get it there.


Experimentation

There were some dead ends along the way. It’s good to try out adventurous ideas, but they definitely aren’t all going to be winners, and that’s OK; allowing for experiments and mistakes is part of a healthy process.

Sometimes the great ideas are buried under crazy ones.

Exploring ideas around sunsets we came across this image, which captivated us due to the interesting sky hues.

A quick grade of our M11 gave us… something which wasn’t working out at all! Good thing that entire experiment only took a few minutes.


Dynamic Range

Values from very dark to black require careful handling. Blacks need to actually hit black in order to use the entire dynamic range and not look washed out, yet not go so far that they create an overly ‘crushed’ look with large areas lacking any detail.

We only have 8 bits of colour information per channel, over 3 channels, to work with on most current monitors and displays. That’s a heck of a lot less than what our eyes can see, so it’s fantastically important to ‘spend’ the entire available dynamic range, otherwise you’re just throwing quality away.

This is an earlier version of M02 with the darks not quite dialed in. The washed-out look was due to the blacks sitting a few percent above zero. Sometimes blacks do want to be lifted, to simulate atmospheric dispersion or haze, etc., but in most cases you want the darkest values closest to the camera to actually hit zero on the waveform.

The final image:

This is the waveform of the washed-out image; notice the black values have clamped into a line, but that line isn’t sitting at zero.
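For anyone who wants to measure this, here is a small sketch of the same black-level check done on a screenshot. It isn’t our tooling; it assumes NumPy and Pillow, and the file names and percentile are placeholders.

```python
# Measure how far the darkest pixels sit above zero, then remove the lift with
# a simple levels adjustment that leaves white where it is.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("m02_screenshot.png").convert("RGB")).astype(np.float32)

luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
black_floor = np.percentile(luma, 0.1)        # where the darkest 0.1% of pixels sit
print(f"black floor is at {black_floor:.1f} / 255")

# Linear remap: the measured floor goes to 0, 255 stays at 255.
fixed = np.clip((img - black_floor) * (255.0 / (255.0 - black_floor)), 0.0, 255.0)
Image.fromarray(fixed.astype(np.uint8)).save("m02_blacks_anchored.png")
```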


Spectral Plots

Colour relationships need careful arranging too. For most levels we used a complementary palette scheme, with variants around the warm and cool values set in conjunction with the time of day and the visual pacing from level to level.

The image below places a dot per pixel of the image on a circular colour wheel. You can see all the dark blues forming a line straight across to the warm lights in the Boneyard facility of M02. It’s no coincidence that this spectral plot forms a line: lines across a colour wheel represent a complementary colour palette.

We tuned the blues of the shadows and the warms of the lights to be very specific hues to get this relationship correct. This is something you can learn to do by eye with practice – we didn’t do spectral plots of all the levels – but it’s fun to check and see how things turned out.

Here’s a plot of a near-final M02.
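A plot like this can be produced with very little code. The sketch below is just one way to do it, assuming NumPy, Pillow and matplotlib; the screenshot name is a placeholder.

```python
# One dot per pixel on a colour wheel: hue becomes the angle, saturation the radius.
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

hsv = np.asarray(Image.open("m02_boneyard.png").convert("HSV")).astype(np.float32)
hue = hsv[..., 0].reshape(-1) / 255.0 * 2.0 * np.pi   # hue mapped to 0..2*pi radians
sat = hsv[..., 1].reshape(-1) / 255.0                 # saturation mapped to 0..1

ax = plt.subplot(projection="polar")
ax.scatter(hue, sat, s=0.05, alpha=0.05)
ax.set_rmax(1.0)
ax.set_title("per-pixel hue/saturation plot")
plt.show()
```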


LUTs: Look Up Tables

After repeating this iteration loop, we’d get to the point where we just couldn’t get the game any closer to our targets using the main in-game lighting controls. This is a good time to employ LUTs on the game and cinematic cameras.

LUTs perform a colour remap where every pixel in the final render is remapped to a new colour based on a texture lookup. The process goes like this: the game does all its rendering calculations and puts a pixel into the render buffer, then that colour is looked up in the reference texture to see what it should now become. Say a blue pixel goes in; the lookup into the reference texture decides that it shouldn’t be that blue, it should be this blue.
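Stripped of the shader plumbing, the remap itself is tiny. Here is a hedged sketch of the idea in Python rather than in the engine, using a hypothetical 16x16x16 table and nearest-neighbour lookup for clarity (a real implementation interpolates between LUT entries); the file names and LUT size are assumptions.

```python
# Remap every rendered pixel through a 3D lookup table:
# lut[r_index, g_index, b_index] -> the new RGB colour for that input colour.
import numpy as np
from PIL import Image

LUT_SIZE = 16
lut = np.load("graded_lut_16.npy")   # hypothetical file, shape (16, 16, 16, 3), values 0..1

img = np.asarray(Image.open("frame.png").convert("RGB")).astype(np.float32) / 255.0

# Quantise each pixel to LUT coordinates and fetch its replacement colour.
idx = np.clip(np.rint(img * (LUT_SIZE - 1)).astype(int), 0, LUT_SIZE - 1)
remapped = lut[idx[..., 0], idx[..., 1], idx[..., 2]]

Image.fromarray((remapped * 255).astype(np.uint8)).save("frame_graded.png")
```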

The LUTs are created by applying the very same colour grading adjustments used on the game screenshots; they capture the colour corrections used to make the game match the reference, or to super fine-tune the black and white levels. After colour grading a game screenshot, you apply that same grade to the base LUT texture and then use that specific LUT on every camera in that level.

LUTs are amazing for that final bit of polish where you can do some very subtle but sophisticated adjustments. Sure you can use them to do aggressive things to the colours, but their most powerful and best-looking application is when they’re used at the very end, after you’ve gotten the game to look as good as possible without them. They pull the image best when used the least. Get the game looking great first.

Here’s the base LUT texture, before colour grading, for a particular level. It’s simply a texture with an incremental spread of all the possible colours.
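If you’re curious how a neutral strip like that is built, here is a small sketch that generates one: each tile sweeps red left to right and green top to bottom, and blue steps up one increment per tile. The 16-step size (a 256x16 strip), file name and channel layout are just examples; the exact layout varies between engines.

```python
# Generate a neutral (identity) LUT strip: grading this image with the same
# corrections used on a screenshot bakes that grade into a usable LUT.
import numpy as np
from PIL import Image

size = 16                                       # 16 tiles of 16x16 -> 256x16 strip
strip = np.zeros((size, size * size, 3), dtype=np.uint8)
ramp = np.linspace(0, 255, size).astype(np.uint8)

for b in range(size):                           # one tile per blue increment
    strip[:, b * size:(b + 1) * size, 0] = np.tile(ramp, (size, 1))           # red along x
    strip[:, b * size:(b + 1) * size, 1] = np.tile(ramp[:, None], (1, size))  # green along y
    strip[:, b * size:(b + 1) * size, 2] = ramp[b]                            # blue per tile

Image.fromarray(strip, mode="RGB").save("base_lut_16.png")
```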


Conclusion

After countless adjustments, extensive chin scratching and endless knob tweaking we landed here with our final colour board for DoK.

We put a lot of care and love into the lighting for Homeworld: Deserts of Kharak and hope you enjoyed this summary of our process.

/A