Wednesday, September 3, 2014

Rendering Scene Normals in Blender

I recently started working on a game where the lighting requires normal maps for all the environment art, but all of that art is hand drawn. Although I've gotten used to painting rough normal maps (it's part of the style, so they can't be too perfect), I wanted to figure out how to render a camera-relative normal map from an arbitrary scene in Blender. I was frustrated at several points by my lack of Blender know-how and the time it took to track down the causes of various problems online, but I finally managed to do it, and I thought it was worth sharing in case anyone else has the same problem.

Because I'm no Blender wizard, I'm using a very simple scene consisting of a sphere (actually a metaball, because it renders without distortion at the poles), a plane, and an orthographic camera positioned directly above the objects and pointed down, with the render output size (Render > Dimensions) set to 1024x1024. The aim of this setup was to produce an image containing all the possible normal colours, which I could then sample from when drawing normal maps by hand. In actual fact, I already had such an image, but I wondered how it was created and so ended up fiddling around with Blender!
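If you'd rather script this kind of setup than click it together, something like the following should reproduce it from Blender's Python console. It's just a sketch against the 2.7x-era API, and the object positions and camera height are my own guesses rather than anything special:

    import bpy

    scene = bpy.context.scene

    # Square output, as in Render > Dimensions above.
    scene.render.resolution_x = 1024
    scene.render.resolution_y = 1024
    scene.render.resolution_percentage = 100

    # A metaball gives a smooth sphere with no distortion at the poles.
    bpy.ops.object.metaball_add(type='BALL', location=(0.0, 0.0, 0.0))

    # A ground plane beneath the ball.
    bpy.ops.mesh.primitive_plane_add(location=(0.0, 0.0, -1.5))

    # An orthographic camera directly above the objects; with zero
    # rotation a new camera looks straight down its local -Z axis.
    bpy.ops.object.camera_add(location=(0.0, 0.0, 5.0), rotation=(0.0, 0.0, 0.0))
    camera = bpy.context.object
    camera.data.type = 'ORTHO'
    scene.camera = camera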

When you have a scene you would like to render, the first thing to do is make sure that the 'Normal' option is checked in RenderLayer > Passes. This tells Blender to output an image of the scene normals, although at this stage the colours will not be correct: rendering the scene now produces the image shown below. These are the normal colours Blender uses internally, but most other software expects something very different for normal maps. To fix the colours, we need to modify the output in the node editor.

[Image: the normal pass rendered with Blender's default colours]
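The same option can be flipped from Python if you're automating this. A small sketch, assuming the render layer still has its default 'RenderLayer' name:

    import bpy

    # Equivalent to ticking 'Normal' under RenderLayer > Passes.
    bpy.context.scene.render.layers["RenderLayer"].use_pass_normal = True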

Open a node editor panel and make sure you set it to compositing mode with the 'use nodes' option enabled (see image below). Without 'use nodes' you won't see anything in the node editor, and compositing mode is what lets us modify rendered output (known as post-processing). Once that is done, we can start to build a node network that will fix the strange colours of our rendered normals.
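For reference, the script equivalent of that step looks something like this (again a sketch against the 2.7x API):

    import bpy

    # Equivalent to enabling 'use nodes' in the compositing node editor.
    scene = bpy.context.scene
    scene.use_nodes = True

    # The compositing node tree we'll be building on below.
    tree = scene.node_tree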

TIP: since we don't need to modify the 3D scene right now, run a render so that the output is visible. The output will update as we change the node network in the next section, and it's helpful to see it as we work.

First, create two new 'Mix' nodes (Add (shift+A) > Color > Mix). A Mix node takes two inputs and combines them using a blending mode, much like the layer modes you might have used in Photoshop or Gimp, which lets us use it to modify the colours of our rendered image. Click the unlabelled drop-down menu (this is the blending mode) on the first Mix node and change it to Multiply, then change the second node to Add. For both nodes, click either of the two colour swatches that appear to the left of the 'Image' input labels and set the colour to neutral grey (0.5, 0.5, 0.5). This means we will multiply by 0.5 and then add 0.5 to each pixel coming in from the 'RenderLayer' node, which remaps the normal components from the [-1, 1] range Blender works in to the [0, 1] range an image can store.

Now the nodes need to be connected together. Connect the leftmost 'RenderLayer' node's 'Normal' output to the 'Image' input of the Multiply node, then connect its 'Image' output to the 'Image' input of the Add node. Finally, connect the 'Image' output of the Add node to the 'Image' input of the 'Composite' node on the right. If you check the render output, you will notice that the colours are still wrong. To fix this, add an 'Invert' node (Add > Color > Invert) between the Add and Composite nodes, and the colours of the output should magically be fixed!

[Image: the finished compositing node network]
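For anyone who would rather build the network from Python, here is a sketch of the whole thing (2.7x API, and it assumes 'use nodes' is already enabled as above):

    import bpy

    tree = bpy.context.scene.node_tree
    tree.nodes.clear()

    render_layers = tree.nodes.new('CompositorNodeRLayers')

    # Multiply by 0.5: squashes the normal range [-1, 1] down to [-0.5, 0.5].
    multiply = tree.nodes.new('CompositorNodeMixRGB')
    multiply.blend_type = 'MULTIPLY'
    multiply.inputs[2].default_value = (0.5, 0.5, 0.5, 1.0)

    # Add 0.5: shifts [-0.5, 0.5] up into the displayable [0, 1] range.
    add = tree.nodes.new('CompositorNodeMixRGB')
    add.blend_type = 'ADD'
    add.inputs[2].default_value = (0.5, 0.5, 0.5, 1.0)

    # Invert: flips Blender Internal's normal convention to the usual one.
    invert = tree.nodes.new('CompositorNodeInvert')

    composite = tree.nodes.new('CompositorNodeComposite')

    # Wire everything up: Normal -> Multiply -> Add -> Invert -> Composite.
    tree.links.new(render_layers.outputs['Normal'], multiply.inputs[1])
    tree.links.new(multiply.outputs['Image'], add.inputs[1])
    tree.links.new(add.outputs['Image'], invert.inputs['Color'])
    tree.links.new(invert.outputs['Color'], composite.inputs['Image'])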

NOTE: if you are rendering with Cycles, you don't need the Invert node. I don't know why this is, but I recommend using Blender Internal anyway because it seems much faster for rendering normals in this way.

However, there's still something not quite right about the result... the image looks a little light, perhaps? What's going on? After much searching around on the internet, I found out that Blender applies gamma correction by default to all renders unless it is disabled manually. This even gets applied to our normals, which means we are not getting the true normal colours. This would be a disaster if there were not an easy way to disable it. Luckily, there is: go into Scene > Color Management and make sure 'Render' is set to 'RAW'. If the output doesn't change instantly, render again and you should see that the output is noticeably darker than before.

[Image: the corrected, noticeably darker render after setting Color Management to RAW]
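Scripted, this is a one-liner. At least in 2.7x it lives on the scene's view settings; older versions expose colour management differently, so treat this as an assumption:

    import bpy

    # Equivalent to setting the render transform to 'Raw' in
    # Scene > Color Management, so gamma correction isn't baked
    # into the normal colours.
    bpy.context.scene.view_settings.view_transform = 'Raw'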

Using this node setup, we can now render the normals of anything!

That's it! We are now rendering the normals of the scene relative to the camera and getting the correct colours. If anything is unclear, don't be afraid to ask in the comments. And, as I've found, Blender Stack Exchange is an invaluable resource, so check it out.

UPDATE: I just noticed that there is a great feature that displays your scene with a special material giving the CORRECT normal colours in real time in the viewport. First, open the panel on the right of the 3D view by pressing 'n'. In there are some options under the heading 'Shading'. First enable 'GLSL', then check 'Matcap' and click the image that appears. This opens a menu of different materials you can use to shade your scene in the viewport. Select the material shown below, and you will have a real-time normal render in the viewport! There are also lots of other cool materials to play with.
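And for completeness, a sketch of flipping those same viewport options from Python (2.7x API; the '06' matcap index is just a placeholder, pick whichever entry is the normal material):

    import bpy

    # Equivalent to enabling 'GLSL' under Shading in the 'n' panel.
    bpy.context.scene.game_settings.material_mode = 'GLSL'

    # Enable a matcap in every 3D view.
    for area in bpy.context.screen.areas:
        if area.type == 'VIEW_3D':
            space = area.spaces.active
            space.use_matcap = True
            space.matcap_icon = '06'  # placeholder; '01' to '24' are available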
