Illumination in OpenGL

Posted by rokkstar on Sat, 15 Jun 2019 06:34:06 +0200

Illumination in Android OpenGL ES 2.0

Translated from Android Course II: Ambient Light and Diffuse Reflective Light

1. What is light?

Without light we would see nothing, and we would not be able to perceive the objects in the world around us.

In the real world, light is made up of countless photons. It leaves a light source, bounces countless times, and finally reaches our eyes.

How should we simulate light in computer graphics? There are two popular techniques: ray tracing and rasterization. Ray tracing gives very accurate and realistic results by mathematically tracking individual rays of light, but simulating all of that light is computationally expensive, so real-time rendering is very slow. Because of this limitation, most real-time computer graphics use rasterization, which simulates light by approximating the results.

2. Classification of Light

We can abstract the ways light behaves into three basic types of light.

  • Ambient light

Ambient light is a basic kind of light. It fills the scene and does not appear to come from any particular direction, because it has been reflected countless times before it reaches you. You experience this kind of light outdoors on an overcast day, or indoors where the contributions of many different light sources accumulate. Rather than calculating all of those individual lights separately, we can simply set a base level of light for the object or scene.

  • Diffuse reflection light

This is the light that reaches your eyes after bouncing off the surface of an object. The illumination level of a surface varies with its angle to the light: it is brightest when it faces the light directly. However, a diffusely lit surface appears equally bright no matter where we view it from. This behavior is described by Lambert's cosine law, and diffuse (Lambertian) reflection is very common in everyday life.

  • Specular highlights

Unlike diffuse reflection, specular highlights change depending on where we are relative to the object and the light. They make objects look shinier and smoother.

3. Simulating Light

Just as there are three kinds of light in a 3D scene, there are three kinds of light sources: directional lights, point lights, and spot lights.

4. Mathematics

Let's work through the math for ambient light and for diffuse light from a point light source.

Ambient light

Ambient light is really just a kind of diffuse light, but it can also be treated as a low-level light that fills the whole scene, which makes it very easy to calculate:

final color = material color * ambient light color

For example, suppose an object is red and our ambient light is a dim gray. Assume each color is stored as an array of three components, red, green and blue, using the RGB color model:

final color = { 1 , 0 , 0 } * { 0.1 , 0.1 , 0.1 } = { 0.1 , 0.0 , 0.0 }

The object ends up a dim red, which is what you would expect from a red object in a dark room. That is about all there is to basic scene lighting, unless you move on to more advanced techniques.
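
As a quick sanity check, here is a minimal Java sketch of that component-wise multiply. The float[] {r, g, b} layout and the method name are only illustrative assumptions, not part of any OpenGL API:

// Component-wise multiply of a material color by the ambient light color.
// Colors are plain {r, g, b} float arrays in the 0..1 range.
static float[] ambientColor(float[] materialColor, float[] ambientLightColor) {
    float[] finalColor = new float[3];
    for (int i = 0; i < 3; i++) {
        finalColor[i] = materialColor[i] * ambientLightColor[i];
    }
    return finalColor;
}

// ambientColor(new float[]{1f, 0f, 0f}, new float[]{0.1f, 0.1f, 0.1f})
// returns {0.1f, 0.0f, 0.0f}, i.e. a dim red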

Diffuse Reflection from a Point Light Source

For diffuse light we also need attenuation and the position of the light. The light's position is used to compute the angle between the light and the surface, which affects the overall lighting level of the surface. It is also used to compute the distance between the light and the surface, which determines the intensity of the light at that point.

Step 1: Calculate Lambert Factor

The first major calculation is the angle between the surface and the light. A surface facing the light head-on is lit at maximum intensity. The right way to model this is Lambert's cosine law. Given two vectors, one pointing from a point on the surface toward the light and the other being the surface normal, we can compute the cosine of the angle between them: first normalize each vector so its length is 1, then take the dot product of the two vectors. This is easy to do in an OpenGL ES shader.

We call this the Lambert factor; it ranges from 0 to 1.

light vector = light position - object position
cosine = dot product(object normal,normalize(light vector))
lambert factor = max(cosine,0)

First we subtract the object's position from the light's position to get the light vector. Then we take the dot product of the object's normal and the normalized light vector, which gives us the cosine of the angle between them. Normalizing the light vector means scaling it so its length is 1. Because the dot product can range from -1 to 1, we clamp it to the range [0, 1].

Here's an example: imagine a flat plane with its surface normal pointing straight up. The light is at {0, 10, -10}, that is, 10 units up and 10 units straight ahead. We want to calculate the light at the origin.

light vector = { 0, 10, -10} - {0, 0, 0} = {0, 10, -10}
object normal = {0, 1, 0}

In plain language: if we start at the object and move along the light vector, we arrive at the light's position. To normalize the vector, we divide each of its components by the vector's length:

light vector length = square root( 0*0 + 10*10 + -10*-10) = square root(200) = 14.14
normalized light vector = {0/14.14, 10/14.14, -10/14.14} = {0, 0.707, -0.707}

Then we calculate the dot product.

dot product({0,1,0},{0,0.707, -0.707}) = 0*0 + 1*0.707 + 0*-0.707 = 0.707
lambert factor = max(0.707,0) = 0.707

OpenGL ES 2's Shader Language already has some of these functions built in, so we don't need to do all the math by hand, but it's still helpful for our understanding.
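
To make the arithmetic above concrete, here is a small Java sketch that reproduces the worked example. The vector helpers are ad hoc illustrations, not OpenGL functions:

// Ad hoc 3D vector helpers, operating on {x, y, z} float arrays.
static float[] subtract(float[] a, float[] b) {
    return new float[]{a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}

static float[] normalize(float[] v) {
    float length = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    return new float[]{v[0] / length, v[1] / length, v[2] / length};
}

static float dot(float[] a, float[] b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Lambert factor: cosine of the angle between the surface normal and the
// direction toward the light, clamped to the range [0, 1].
static float lambertFactor(float[] lightPos, float[] objectPos, float[] objectNormal) {
    float[] lightVector = subtract(lightPos, objectPos);
    float cosine = dot(objectNormal, normalize(lightVector));
    return Math.max(cosine, 0.0f);
}

// lambertFactor(new float[]{0, 10, -10}, new float[]{0, 0, 0}, new float[]{0, 1, 0})
// returns roughly 0.707, matching the numbers above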

Step 2: Calculate the attenuation factor

Next we need the attenuation. The attenuation of a real point light source follows the inverse square law, which can be expressed as:

luminosity = 1 / (distance * distance)

Back to our example: the distance is 14.14, so the luminosity works out to:

luminosity = 1 / (14.14 * 14.14) = 0.005

As you can see, the inverse square law causes brightness to fall off sharply with distance. This is how light behaves when it leaves a point source in the real world, but the brightness range of our displays is limited, so by damping the attenuation factor we can get more realistic-looking light without making objects look too dark.
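
To see the difference in numbers, here is a tiny Java sketch comparing the raw inverse square falloff with a damped variant. The damped form (with its 0.25 constant) is the one used later in the vertex shader; the constant is a tuning value, not a physical one:

// Physically motivated attenuation: falls off very quickly with distance.
static float inverseSquare(float distance) {
    return 1.0f / (distance * distance);
}

// Damped attenuation, as used in the vertex shader further down.
static float dampedAttenuation(float distance) {
    return 1.0f / (1.0f + 0.25f * distance * distance);
}

// inverseSquare(14.14f)     -> about 0.005
// dampedAttenuation(14.14f) -> about 0.02, noticeably brighter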

Step 3: Calculate the final color

Now that we have both the Lambert factor and the attenuation factor, we can calculate the final illumination level:

final color = material color * (light color * lambert factor * luminosity)

Back to the previous example, with a red material and a pure white light source, the final calculation is:

final color = {1, 0, 0} *({1,1,1}*0.707*0.005) 
= {1,0,0}*{0.0035,0.0035,0.0035} 
= {0.0035,0,0}

To sum up: to compute the overall diffuse lighting level we use the angle between the surface and the light, and the distance between them. The steps are:

// Step 1: Lambert factor
light vector = light position - object position
cosine = dot product(object normal,normalize(light vector))
lambert factor = max(cosine,0)

// Step 2: attenuation
luminosity = 1/(distance*distance)

// Step 3: final color
final color = material color * (light color*lambert factor * luminosity)
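
Putting the three steps together on the CPU side, a Java sketch might look like the following. It reuses the ad hoc subtract/normalize/dot helpers from the Lambert factor sketch above, and the names are only illustrative:

// Diffuse lighting for one surface point, using plain inverse-square attenuation.
static float[] diffuseColor(float[] materialColor, float[] lightColor,
                            float[] lightPos, float[] objectPos, float[] objectNormal) {
    // Step 1: Lambert factor
    float[] lightVector = subtract(lightPos, objectPos);
    float lambert = Math.max(dot(objectNormal, normalize(lightVector)), 0.0f);

    // Step 2: attenuation
    float distance = (float) Math.sqrt(dot(lightVector, lightVector));
    float luminosity = 1.0f / (distance * distance);

    // Step 3: final color, per channel
    float[] finalColor = new float[3];
    for (int i = 0; i < 3; i++) {
        finalColor[i] = materialColor[i] * (lightColor[i] * lambert * luminosity);
    }
    return finalColor;
}

// diffuseColor(new float[]{1, 0, 0}, new float[]{1, 1, 1},
//              new float[]{0, 10, -10}, new float[]{0, 0, 0}, new float[]{0, 1, 0})
// returns roughly {0.0035, 0, 0}, matching the worked example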

Vertex Shader

final String vertexShader =
    "uniform mat4 u_MVPMatrix;      \n"     // A matrix representing the transformation of Model, View and Projection
  + "uniform mat4 u_MVMatrix;       \n"     // A matrix representing the transformation of Model and View
  + "uniform vec3 u_LightPos;       \n"     // The Position of Light in Eye Coordinate System

  + "attribute vec4 a_Position;     \n"     // Input vertex location information
  + "attribute vec4 a_Color;        \n"     // The color information of each vertex passed on
  + "attribute vec3 a_Normal;       \n"     // Normal information for each vertex

  + "varying vec4 v_Color;          \n"     // Color information, which is passed into the fragment shader

  + "void main()                    \n"     
  + "{                              \n"
// Converting vertices into eye coordinates
  + "   vec3 modelViewVertex = vec3(u_MVMatrix * a_Position);              \n"
// Transfer the normal direction into the eye coordinate system
  + "   vec3 modelViewNormal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));     \n"
// Calculate the distance between the vertex and the light  
  + "   float distance = length(u_LightPos - modelViewVertex);             \n"
// Get the unit vector pointing from the vertex toward the light
  + "   vec3 lightVector = normalize(u_LightPos - modelViewVertex);        \n"
// Calculate the Lambert factor, i.e. the dot product of the light vector and the vertex normal. If the normal and the light vector point in the same direction, we get maximum illumination. (The 0.1 floor keeps surfaces from going fully black, acting like a cheap ambient term.)
  + "   float lambert = max(dot(modelViewNormal, lightVector), 0.1);       \n"
// Attenuate the light with distance (a damped falloff rather than strict inverse square)
  + "   float diffuse = lambert * (1.0 / (1.0 + (0.25 * distance * distance)));  \n"
// Multiply the color by the illumination level; it will be interpolated across the triangle.
  + "   v_Color = a_Color * diffuse;                                       \n"
// gl_Position stores the final location.
// Multiply this vector by the transformation matrix to get the points in the normalized screen coordinate system
  + "   gl_Position = u_MVPMatrix * a_Position;                            \n"
  + "}                                                                     \n";

Fragment Shader

final String fragmentShader =
  "precision mediump float;       \n"     // Set the default precision to medium. We don't need as high of a
                                          // precision in the fragment shader.
+ "varying vec4 v_Color;          \n"     // This is the color from the vertex shader interpolated across the
                                          // triangle per fragment.
+ "void main()                    \n"     // The entry point for our fragment shader.
+ "{                              \n"
+ "   gl_FragColor = v_Color;     \n"     // Pass the color directly through the pipeline.
+ "}                              \n";

Vertex and Fragment Shaders for the Light Source

// Define a simple shader program for our point.
final String pointVertexShader =
    "uniform mat4 u_MVPMatrix;      \n"
  + "attribute vec4 a_Position;     \n"
  + "void main()                    \n"
  + "{                              \n"
  + "   gl_Position = u_MVPMatrix   \n"
  + "               * a_Position;   \n"
  + "   gl_PointSize = 5.0;         \n"
  + "}                              \n";

final String pointFragmentShader =
    "precision mediump float;       \n"
  + "void main()                    \n"
  + "{                              \n"
  + "   gl_FragColor = vec4(1.0,    \n" //Directly specify the color of the fragment to be white
  + "   1.0, 1.0, 1.0);             \n"
  + "}                              \n";

gl_PointSize is a built-in vertex shader output that sets the size, in pixels, of the point being rasterized.
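
Drawing the light source itself then only takes a few calls, typically inside the renderer's onDrawFrame. The handle and array names below are illustrative assumptions, not defined in this article:

// Switch to the point program and look up its handles.
GLES20.glUseProgram(pointProgramHandle);
int pointMVPMatrixHandle = GLES20.glGetUniformLocation(pointProgramHandle, "u_MVPMatrix");
int pointPositionHandle  = GLES20.glGetAttribLocation(pointProgramHandle, "a_Position");

// A single point does not need a vertex buffer; pass the light position
// as a constant vertex attribute instead.
GLES20.glVertexAttrib3f(pointPositionHandle,
        lightPosInModelSpace[0], lightPosInModelSpace[1], lightPosInModelSpace[2]);
GLES20.glDisableVertexAttribArray(pointPositionHandle);

// Upload the Model-View-Projection matrix for the light and draw one point.
GLES20.glUniformMatrix4fv(pointMVPMatrixHandle, 1, false, lightMVPMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_POINTS, 0, 1);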
