Unity 5.x Shaders and Effects Cookbook

Normal mapping

Every triangle of a 3D model has a facing direction, which is the direction that it is pointing toward. It is often represented with an arrow placed in the center of the triangle, orthogonal to the surface. The facing direction plays an important role in the way light reflects off a surface. If two adjacent triangles face different directions, they will reflect light at different angles, so they'll be shaded differently. For curved objects, this is a problem: it becomes obvious that the geometry is made out of flat triangles.

To avoid this problem, the way light reflects on a triangle doesn't take its facing direction into account, but its normal direction instead. As stated in the Adding a texture to a shader recipe, vertices can store data; the normal direction is the most used information after the UV data. This is a vector of unit length that indicates the direction faced by the vertex. Regardless of the facing direction, every point within a triangle has its own normal direction, which is a linear interpolation of the ones stored in its vertices. This gives us the ability to fake the effect of high-resolution geometry on a low-resolution model. The following image shows the same geometric shape rendered with different per-vertex normals. In the image on the left, the normals are orthogonal to the faces defined by their vertices; this indicates that there is a clear separation between each face. On the right, the normals are interpolated along the surface, indicating that even though the surface is rough, light should reflect as if it were smooth. It's easy to see that even though the three objects in the following image share the same geometry, they reflect light differently. Despite being made out of flat triangles, the object on the right reflects light as if its surface were actually curved:

Smooth objects with rough edges are a clear indication that per-vertex normals have been interpolated. This can be seen if we draw the direction of the normal stored in every vertex, as shown in the following image. You should note that every triangle has only three normals, but as multiple triangles can share the same vertex, more than one line can come out of it:

Calculating the normals from the 3D model alone is a technique that has rapidly declined in favor of a more advanced one—normal mapping. Similar to what happens with texture mapping, the normal directions can be provided using an additional texture, usually called a normal map or bump map. Normal maps are usually RGB images, where the RGB components are used to indicate the X, Y, and Z components of the normal direction. There are many ways to create normal maps these days. Some applications, such as CrazyBump (http://www.crazybump.com/) and NDO Painter (http://quixel.se/ndo/), will take in 2D data and convert it to normal data for you. Other applications, such as ZBrush 4R7 (http://www.pixologic.com/) and Autodesk (http://usa.autodesk.com), will take 3D sculpted data and create normal maps for you. The actual process of creating normal maps is beyond the scope of this book, but the preceding links should help you get started.
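To give a feel for what tools such as CrazyBump do when converting 2D data, here is a simplified sketch in Python: it derives a normal map from a height map by treating the per-pixel slope as the X/Y tilt of the normal. The function name and the `strength` parameter are illustrative, not part of any real tool:

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a 2D height map (values in [0, 1]) to an RGB normal map.

    A simplified version of what height-to-normal tools do: the slope
    of the height field becomes the X/Y tilt of the per-pixel normal.
    """
    # Per-pixel slopes along each image axis.
    dy, dx = np.gradient(height)
    # Build unnormalized normals; a steeper slope gives a stronger tilt.
    normals = np.dstack((-dx * strength, -dy * strength, np.ones_like(height)))
    # Normalize each normal to unit length.
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    # Remap from [-1, 1] to the [0, 1] range stored in an RGB texture.
    return normals * 0.5 + 0.5

# A completely flat height map yields the "no bump" color (0.5, 0.5, 1)
# at every pixel.
flat = height_to_normal_map(np.zeros((4, 4)))
print(flat[0, 0])
```

Note how a flat surface encodes as the pale blue color typical of normal maps: X and Y are centered at 0.5 and Z is at full intensity.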

Unity makes adding normals to your Surface Shaders quite easy using the UnpackNormal() function. Let's see how this is done.

Getting ready

Create a new material and shader and set them up on a new object in the Scene view. This will give us a clean workspace in which we can look at just the normal mapping technique.

You will need a normal map for this recipe, but there is also one in the Unity project included with this book.

An example normal map included with this book's contents is shown here:

How to do it…

The following are the steps to create a normal map shader:

  1. Let's get the Properties block set up in order to have a color tint and texture:
    Properties
    {
      _MainTint ("Diffuse Tint", Color) = (1,1,1,1)
      _NormalTex ("Normal Map", 2D) = "bump" {}
    }
    Note

    By initializing the texture as bump, we are telling Unity that _NormalTex will contain a normal map. If no texture is assigned, it will be replaced by a default flat normal map whose texels unpack to the (0, 0, 1) normal, indicating no bump at all.

  2. Link the properties to the Cg program by declaring them in SubShader{} below the CGPROGRAM statement:
    CGPROGRAM
    #pragma surface surf Lambert
    
    // Link the property to the CG program
    sampler2D _NormalTex;
    float4 _MainTint;
  3. We need to make sure that we update the Input struct with the proper variable name so that we can use the model's UVs for the normal map texture:
    // Make sure you get the UVs for the texture in the struct
    struct Input
    {
      float2 uv_NormalTex;
    };
  4. Finally, we extract the normal information from the normal map texture using the built-in UnpackNormal() function. Then, you only have to apply these new normals to the output of the Surface Shader:
    // Get the normal data out of the normal map texture
    // using the UnpackNormal function
    float3 normalMap = UnpackNormal(tex2D(_NormalTex, IN.uv_NormalTex));
    
    // Apply the new normal to the lighting model
    o.Normal = normalMap.rgb;
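Putting the steps together, the complete shader looks something like the following sketch. The shader path name and the use of _MainTint for the Albedo are assumptions for illustration; the property, struct, and function names match the snippets above:

```
Shader "CookbookShaders/NormalMap"
{
  Properties
  {
    _MainTint ("Diffuse Tint", Color) = (1,1,1,1)
    _NormalTex ("Normal Map", 2D) = "bump" {}
  }
  SubShader
  {
    Tags { "RenderType"="Opaque" }
    CGPROGRAM
    #pragma surface surf Lambert

    sampler2D _NormalTex;
    float4 _MainTint;

    struct Input
    {
      float2 uv_NormalTex;
    };

    void surf (Input IN, inout SurfaceOutput o)
    {
      // Tint the surface color
      o.Albedo = _MainTint.rgb;
      // Unpack the normal map and apply it to the lighting model
      float3 normalMap = UnpackNormal(tex2D(_NormalTex, IN.uv_NormalTex));
      o.Normal = normalMap;
    }
    ENDCG
  }
  FallBack "Diffuse"
}
```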

The following image demonstrates the result of our normal map shader:

Note

Shaders can have both a texture map and normal map. It is not uncommon to use the same UV data to address both. However, it is possible to provide a secondary set of UVs in the vertex data (UV2) specifically used for the normal map.
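If you want the normal map to be sampled with the model's secondary UV set, the Input struct member can use the uv2 prefix instead of uv. A minimal sketch, reusing the _NormalTex property name from this recipe:

```
struct Input
{
  float2 uv2_NormalTex;  // sampled from the mesh's UV2 channel
};
```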

How it works…

The actual math to perform the normal mapping effect is definitely beyond the scope of this chapter, but Unity has already done it all for us. It has wrapped the math in built-in functions so that we don't have to rewrite it over and over again. This is another reason why Surface Shaders are a really efficient way to write shaders.

If you look in the UnityCG.cginc file found in the Data folder in your Unity installation directory, you will find the definitions for the UnpackNormal() function. When you call this function in your Surface Shader, Unity takes the provided normal map, processes it for you, and gives you the correct type of data so that you can use it in your per-pixel lighting function. It's a huge time-saver! When sampling a texture, you get RGB values from 0 to 1; however, the components of a normal vector range from -1 to +1. UnpackNormal() brings these components into the right range.
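The core remapping that UnpackNormal() performs for an uncompressed normal map can be sketched in a few lines of Python (compressed DXT5nm maps require extra steps, which are omitted here; the function name is illustrative):

```python
def unpack_normal(rgb):
    """Remap a sampled texel from the [0, 1] texture range to a
    [-1, 1] direction vector, as UnpackNormal() does for an
    uncompressed normal map."""
    return tuple(c * 2.0 - 1.0 for c in rgb)

# The neutral normal-map color (0.5, 0.5, 1) unpacks to the
# "straight up" normal (0, 0, 1).
print(unpack_normal((0.5, 0.5, 1.0)))
```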

Once you have processed the normal map with the UnpackNormal() function, you send it back to your SurfaceOutput struct so that it can be used in the lighting function. This is done by o.Normal = normalMap.rgb;. We will see how the normal is actually used to calculate the final color of each pixel in Chapter 3, Understanding Lighting Models.

There's more…

You can also add a control to your normal map shader that lets a user adjust the intensity of the normal map. This is easily done by modifying the x and y components of the normal map variable and then renormalizing the result. Add another property to the Properties block and name it _NormalMapIntensity:

_NormalMapIntensity("Normal intensity", Range(0,1)) = 1

Multiply the x and y components of the unpacked normal map and reapply this value to the normal map variable:

fixed3 n = UnpackNormal(tex2D(_NormalTex, IN.uv_NormalTex));
n.x *= _NormalMapIntensity;
n.y *= _NormalMapIntensity;
o.Normal = normalize(n);
Note

Normal vectors are supposed to have a length equal to one. Multiplying them by _NormalMapIntensity changes their length, making normalization necessary.
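The effect of this trick can be verified numerically: scaling x and y shortens the vector, and normalizing restores unit length while tilting the normal closer to "straight up", which flattens the bump. A small Python sketch (the function name is illustrative):

```python
import math

def scale_and_normalize(n, intensity):
    """Scale the X/Y components of a unit normal, then renormalize.

    Mirrors the _NormalMapIntensity trick: scaling x and y changes
    the vector's length, so normalization is needed afterward.
    """
    x, y, z = n[0] * intensity, n[1] * intensity, n[2]
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

n = (0.6, 0.0, 0.8)                      # a unit-length normal
flattened = scale_and_normalize(n, 0.5)  # intensity < 1 flattens the bump
print(flattened)
```

Note that the resulting z component is larger than the original 0.8: the normal points more directly away from the surface, so the bump appears weaker.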

Now, you can let a user adjust the intensity of the normal map in the material's Inspector tab. The following image shows the result of modifying the normal map with our scalar values: