Shaders in Unity are not difficult. Part 2 – diffuse shading

Let’s understand shaders

Hello everyone! Thank you for all the comments and feedback on the previous article. Together we are filling the Internet with accessible knowledge, and that is really cool.
Today we continue working with shaders, this time focusing on lighting. We will take a look at the Lambert lighting model, get acquainted with diffuse shading, and, as usual, write and analyze an AD (ambient + diffuse) shader.

Theory

So, let's start our acquaintance with the Lambertian lighting model.

Lambertian reflectance is the property that defines an ideal "matte", or diffusely reflecting, surface. To an observer, the apparent brightness of a Lambertian surface is the same regardless of the angle of view. Technically speaking, the surface's luminance is isotropic, and the luminous intensity obeys Lambert's cosine law. Lambertian reflectance is named after Johann Heinrich Lambert, who introduced the concept of perfect diffusion in his 1760 book Photometria.

– Wikipedia

Cool, but not very clear. Let's look at examples. Start with the fact that in the world around us we do not see objects themselves; we see the light coming from them. Objects can both emit light and reflect it. Most of the time we deal with white light, which we can think of as an electromagnetic wave containing the whole visible spectrum. When white light hits an object, part of it is absorbed and part is reflected. What we perceive as the color of the object is the reflected part of the white light.

Light absorption and reflection

Transferring all this to Unity, this physical process can be broken down into the following components:

  1. The light source emits light

  2. Light falls on an object, bouncing and scattering.

  3. The reflected light hits the camera and we get the picture.

In Unity, a light source is described by a point in space, and its power (the amount of light) is described by the Intensity parameter. Since the angle of view does not matter in the Lambert lighting model, we will see the same color from all sides.

Coming back to reality, consider lighting a curved surface. When a ray of light hits a perfectly smooth surface, it is reflected, and since the incident rays are parallel, the reflected rays remain parallel as well. This regular reflection is called specular reflection. Perfectly smooth objects are rare, however; most surfaces are rough and uneven to some degree. In that case the incident rays are still parallel, but the reflected rays are scattered in all directions. This phenomenon is diffuse reflection of light.

Reflection of light on a curved surface

As you can see, in areas where the light falls perpendicular to the surface the intensity is highest, while in areas where the light arrives at a grazing angle the illumination is lower. The diffuse shader should reproduce the same behaviour.

Diffuse lighting characteristics:

  1. The illumination intensity of a pixel does not depend on the viewing angle.

  2. The light intensity depends on the angle of incidence of light on the surface.

So, we need to understand what it takes to calculate the color at every pixel. In Lambert's model, the surface of an object is treated as a perfectly diffuse reflector (only diffuse reflection occurs). Let's look at the formula for the ambient term:

intensityAmbientDiffuse = rc * ia;

Formula for calculating directional light:

intensityDirectDiffuse = rc * id * max(0, dot(N, L));

  • rc (reflection coefficient) – the reflection coefficient of the material;

  • ia (intensity ambient) – the intensity of the ambient light;

  • id (intensity direct) – the intensity of the directional light;

  • N – the unit normal vector of the surface at the vertex;

  • L – the unit vector pointing from the vertex towards the light source;

  • dot – the dot product of two vectors.

The formula for directional light differs slightly from the one for ambient light. When calculating the reflection of ambient light, its direction does not matter, since ambient light has no single direction (more precisely, it arrives from all directions), so the ambient term is the same for every vertex. For directional light, the direction does matter: the amount of diffusely reflected light depends on the angle between the light direction and the surface normal, and the smaller this angle, the stronger the reflection. The intensity of the diffuse reflection of directional light follows Lambert's cosine law. Putting the two terms together, we get:

intensityDiffuse = rc * ia + rc * id * max(0, dot(N, L));
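
To make the formula concrete, here is a small worked example with assumed values (rc = 1, ia = 0.1, id = 1). If the light hits the surface at 60° to the normal, then dot(N, L) = cos 60° = 0.5, and:

intensityDiffuse = 1 * 0.1 + 1 * 1 * 0.5 = 0.6

In other words, such a point ends up with 60% of the full intensity: 0.1 from the ambient term and 0.5 from the directional term.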

It is not my goal to burden the article with a huge amount of theory. In my opinion, all the basic points have been considered, so I propose to finish with the theory and move on to practice.

Practice

So, having figured out a little about how light works in Unity and how we can apply it in a shader, let's move on to writing the shader. Let me remind you that we are writing an AD (ambient + diffuse) shader.

Complete shader code
Shader "Chernov/Diffuse"
    {
    Properties
    {
        _MainTex ("Main Texture", 2D) = "white" {}
 
        [Header(Ambient)]
        _Ambient ("Intensity", Range(0., 1.)) = 0.1
        _AmbColor ("Color", color) = (1., 1., 1., 1.)
 
        [Header(Diffuse)]
        _Diffuse ("Val", Range(0., 1.)) = 1.
        _DifColor ("Color", color) = (1., 1., 1., 1.)
     }
 
    SubShader
    {
        Pass
        {
            Tags { "Queue"="Geometry" "LightMode"="ForwardBase" }
 
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
  
            #include "UnityCG.cginc"
 
            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                fixed4 light : COLOR0;
            };
 
            fixed4 _LightColor0;
            fixed _Diffuse;
            fixed4 _DifColor;
            fixed _Ambient;
            fixed4 _AmbColor;
 
            v2f vert(appdata_base v)
            {
                v2f o;
 
                // Clip position
                o.pos = UnityObjectToClipPos(v.vertex);
 
                // Light direction
                float3 lightDir = normalize(_WorldSpaceLightPos0.xyz);
 
                // Normal in WorldSpace
                float3 worldNormal = UnityObjectToWorldNormal(v.normal.xyz);
 
                 // World position
                float4 worldPos = mul(unity_ObjectToWorld, v.vertex);

                // Camera direction
                float3 viewDir = normalize(UnityWorldSpaceViewDir(worldPos.xyz));
 
                // Compute ambient lighting
                fixed4 amb = _Ambient * _AmbColor;
 
                // Compute the diffuse lighting
                // Compute the diffuse lighting (Lambert cosine term)
                fixed lightTemp = max(0., dot(worldNormal, lightDir));
                fixed4 diffuse = lightTemp * _Diffuse * _LightColor0 * _DifColor;

                o.light = diffuse + amb;
               
                o.uv = v.texcoord.xy;
                return o;
            }
 
            sampler2D _MainTex;

            fixed4 frag(v2f i) : SV_Target
            {
                fixed4 c = tex2D(_MainTex, i.uv);
                c.rgb *= i.light;
                return c;
            }
 
            ENDCG
        }
    }
}

As usual, we will parse the code line by line. According to the Unity shader syntax, a shader starts with a name declaration, in our case Shader "Chernov/Diffuse". This is followed by the block of shader properties:

  • _MainTex ("Main Texture", 2D) = "white" {} – the main texture of the shader (by default, the texture is white);

  • _Ambient ("Intensity", Range(0., 1.)) = 0.1 – the intensity of the ambient light (in the range 0 – 1);

  • _AmbColor ("Color", color) = (1., 1., 1., 1.) – ambient light color (default – white);

  • _Diffuse ("Val", Range(0., 1.)) = 1. – intensity of diffuse lighting (in the range 0 – 1);

  • _DifColor ("Color", color) = (1., 1., 1., 1.) – diffuse light color (default – white);

Next, let's move on to the sub-shader; in our case there is only one. First of all, we declare the shader tags:
Tags { "Queue"="Geometry" "LightMode"="ForwardBase" }
Tags are key-value pairs that define properties of the shader. You can use the predefined tags or add your own, and tags can also be read from C# code. In our case, we use the following tags (an example of a different tag set is sketched after the list):

  • "Queue"="Geometry" – built-in tag, defines the Geometry rendering queue;

  • "LightMode"="ForwardBase" – built-in tag, defines the lighting model, applies ambient, main directional light, vertex lights and lightmaps; You can read more about tags. here, here, and so here

Next, the CGPROGRAM keyword opens the shader section written in Cg/HLSL. We will use the same names as in the previous part: #pragma vertex vert declares the function named vert as the vertex shader, and #pragma fragment frag declares the function named frag as the fragment shader. The line #include "UnityCG.cginc" includes a library of helper and generally useful functions; if you want to know more about this library, see the Unity documentation.

Moving further through the code, we define the data structure for the fragment shader:

struct v2f {
      float4 pos : SV_POSITION;
      float2 uv : TEXCOORD0;
      fixed4 light : COLOR0;
};

The name v2f is a conventional one, used both in Unity's own examples and in many books and articles; it stands for "vertex to fragment". In this structure, we define the three variables we will need:

  • float4 pos : SV_POSITION – the position of the vertex in clip space; SV_POSITION is the semantic of the variable. Specifying semantics for inputs and outputs is standard practice when writing shaders. More on this in the Unity documentation;

  • float2 uv : TEXCOORD0 – texture coordinates. TEXCOORD0 means the first set of coordinates is used (there can be several; lightmaps, for example, usually use the second set, uv1);

  • fixed4 light : COLOR0 – the calculated lighting; COLOR0 is the interpolator slot from which the value is read.

Let's start with the analysis of the vertex shader. Consider its signature: v2f vert(appdata_base v). As you can see, the return type is v2f, the structure defined earlier. appdata_base v is a predefined input type for a vertex shader; it provides the vertex position, the normal, and one set of texture coordinates. In addition to appdata_base, there are also appdata_tan and appdata_full (a simplified sketch of appdata_base follows the list below):

  • appdata_tan – Provides vertex coordinates, tangent, normal, and one set of texture coordinates;

  • appdata_full – Provides vertex coordinates, tangent, normal, four sets of texture coordinates, and a color.
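
For reference, this is roughly how appdata_base is declared in UnityCG.cginc (a simplified sketch; recent Unity versions also add an instancing ID field):

struct appdata_base {
    float4 vertex   : POSITION;  // vertex position in object space
    float3 normal   : NORMAL;    // vertex normal in object space
    float4 texcoord : TEXCOORD0; // first set of texture coordinates
};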

For more details, see the Unity documentation. Next, we declare the block of variables that will be used in the shader. In our case, these are:

fixed4 _LightColor0;
fixed _Diffuse;
fixed4 _DifColor;
fixed _Ambient;
fixed4 _AmbColor;

As mentioned earlier, the variables must have the same names as in the Properties block. Note also the variable fixed4 _LightColor0: since we declared the tag "LightMode"="ForwardBase", several built-in lighting variables became available to us, and _LightColor0 is one of them. Now to the body of the vertex shader. First of all, we transform the vertex from object space into clip space:
o.pos = UnityObjectToClipPos(v.vertex);
Next, we need to calculate the light intensity, as described in the theory section.

To calculate, we will need to calculate the following data:

  • unit vector illumination direction;

  • normal in world coordinates;

  • position in world coordinates;

  • camera direction.

We get the corresponding calculations:

// Light direction
float3 lightDir = normalize(_WorldSpaceLightPos0.xyz);

// Normal in WorldSpace
float3 worldNormal = UnityObjectToWorldNormal(v.normal.xyz);
 
// World position
float4 worldPos = mul(unity_ObjectToWorld, v.vertex);

// Camera direction
float3 viewDir = normalize(UnityWorldSpaceViewDir(worldPos.xyz));

The variable _WorldSpaceLightPos0 is a built-in variable that becomes available when LightMode is set to ForwardBase or ForwardAdd; for a directional light its xyz components hold the direction of the light.
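
Our shader assumes a single directional light. As a hedged sketch (not part of the article's shader), a more general version could check the w component: for a directional light _WorldSpaceLightPos0.w is 0 and xyz holds the direction, while for point and spot lights w is 1 and xyz holds the position:

float3 lightDir;
if (_WorldSpaceLightPos0.w == 0.0)
    lightDir = normalize(_WorldSpaceLightPos0.xyz);                // directional light: xyz is the direction
else
    lightDir = normalize(_WorldSpaceLightPos0.xyz - worldPos.xyz); // point/spot light: xyz is the position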

The function UnityObjectToWorldNormal is a built-in function that converts a normal from object space to world space.
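
A normal cannot simply be multiplied by the model matrix when the object is scaled non-uniformly, so the function effectively multiplies the normal by the inverse-transpose of the model matrix. A rough equivalent of what UnityCG.cginc does in the general case:

float3 worldNormal = normalize(mul(v.normal, (float3x3)unity_WorldToObject));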

The function mul is a built-in Cg/HLSL function. Its signature is mul(a, b), where the arguments can be matrices or vectors; the result is the product of a and b.

To obtain the world position, we multiply the model matrix by the vertex coordinates. unity_ObjectToWorld is a built-in variable that holds the current object-to-world (model) matrix.

To calculate the direction towards the camera, we use the built-in function UnityWorldSpaceViewDir; its input parameter is the world-space position of the vertex. We normalize the result with the built-in normalize function.
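
UnityWorldSpaceViewDir simply returns the vector from the given world-space point to the camera, so a rough equivalent of the line above is:

float3 viewDir = normalize(_WorldSpaceCameraPos.xyz - worldPos.xyz);

Note that viewDir is not actually used in the Lambert calculation below: as stated in the theory section, diffuse lighting does not depend on the viewing angle, so the variable is computed here only for reference.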

So, all the data needed for the calculation has been obtained, and we can proceed to the lighting itself. First of all, let's calculate the ambient term:

fixed4 amb = _Ambient * _AmbColor;

Here we multiply the ambient intensity by the ambient color. Next, we calculate the diffuse lighting:

fixed lightTemp = max(0., dot(worldNormal, lightDir));
fixed4 diffuse = lightTemp * _Diffuse * _LightColor0 * _DifColor;

Here we calculate the diffuse lighting using the formulas described in the theory section. Next, we just need to add the ambient and diffuse terms:

o.light = diffuse + amb;

Done. The lighting calculation is complete. To finish filling the output structure, we just need to pass the uv coordinates:

o.uv = v.texcoord.xy;

The vertex shader is finished. Now let's move on to the fragment shader. There is not much computation left here: all we need to do is sample the texture at the uv coordinates and apply the lighting to it. The fragment shader code looks like this (a per-pixel variant is sketched after the listing):

fixed4 frag(v2f i) : SV_Target
{
    fixed4 c = tex2D(_MainTex, i.uv);
    c.rgb *= i.light;
    return c;
}
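
Because the lighting is computed in the vertex shader and then interpolated, we get per-vertex (Gouraud-style) shading, which can look faceted on low-poly meshes. As a hedged sketch (a hypothetical variant, not part of this article's shader), the same Lambert term could instead be evaluated per pixel by passing the world-space normal to the fragment shader:

// Hypothetical per-pixel variant: the vertex shader fills in the world-space
// normal instead of the pre-computed light, and the Lambert term is evaluated
// for every fragment.
struct v2f_perPixel {
    float4 pos    : SV_POSITION;
    float2 uv     : TEXCOORD0;
    float3 normal : TEXCOORD1; // world-space normal from the vertex shader
};

fixed4 fragPerPixel(v2f_perPixel i) : SV_Target
{
    float3 lightDir = normalize(_WorldSpaceLightPos0.xyz);
    fixed lambert = max(0., dot(normalize(i.normal), lightDir));
    fixed4 light = _Ambient * _AmbColor + lambert * _Diffuse * _LightColor0 * _DifColor;
    fixed4 c = tex2D(_MainTex, i.uv);
    c.rgb *= light.rgb;
    return c;
}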

Done. The shader is written. It remains to test it in Unity.

It seems something broke and I cannot embed the GIF; you can see it here

As you can see, the shader works correctly: the intensity and the colors can be changed through the shader properties.
Let's also check how the shader reacts to the light source:

It seems something broke and I cannot embed the GIF; you can see it here

As you can see, everything works correctly: the illumination of the object changes as the parameters of the light source change.

That's all for now. Thank you for your attention. I know the articles are not published often, but I will try to do better and write more. I read all your comments and welcome objective criticism and remarks.

Alexey Chernov

Team Lead at Program-Ace
