We are going to start by rendering our sphere in a solid color, but with lighted shading. As usual, we start by writing the shader functions which, among other things, define the program variables they will need from the Material that uses it. Then, we'll define the SolidColorLightingMaterial class and add it to the Sphere component.
In the previous chapters, where we used shaders with lighting, we did the lighting calculations in the vertex shader. That's simpler (and faster), but moving the calculations to the fragment shader yields better results. The reason is that, in the vertex shader, you have only one normal value to compare against the light direction. In the fragment shader, all vertex attributes are interpolated, meaning that the normal value at a given point between two vertices will be some point in between their two normals. When this is the case, you see a smooth gradient across the triangle face, rather than localized shading artifacts around each vertex. We will be creating a new Material class to implement lighting in the fragment shader.
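To get a feel for the difference, here is a minimal Java sketch (the class and helper names are made up for illustration) comparing the two approaches at the midpoint between two vertices: lighting at each vertex and interpolating the results, versus interpolating the normal first and lighting per fragment.

```java
public class NormalInterpolation {
    // Linearly interpolate between two 3-vectors and renormalize,
    // mimicking what the GPU does to a varying normal per fragment
    static float[] lerpNormalize(float[] a, float[] b, float t) {
        float[] v = new float[]{
                a[0] + t * (b[0] - a[0]),
                a[1] + t * (b[1] - a[1]),
                a[2] + t * (b[2] - a[2])};
        float len = (float) Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new float[]{v[0]/len, v[1]/len, v[2]/len};
    }

    // Simple diffuse term: clamped dot product of normal and light direction
    static float diffuse(float[] n, float[] lightDir) {
        float d = n[0]*lightDir[0] + n[1]*lightDir[1] + n[2]*lightDir[2];
        return Math.max(d, 0.0f);
    }

    public static void main(String[] args) {
        float[] light = {0, 0, 1};   // light shining along +z
        float[] n0 = {0, 0, 1};      // vertex A faces the light
        float[] n1 = {1, 0, 0};      // vertex B faces sideways
        // Per-vertex: light at the vertices, then interpolate the results
        float perVertex = 0.5f * (diffuse(n0, light) + diffuse(n1, light));
        // Per-fragment: interpolate the normal, then light the midpoint
        float perFragment = diffuse(lerpNormalize(n0, n1, 0.5f), light);
        System.out.println(perVertex + " vs " + perFragment);
    }
}
```

The midpoint comes out noticeably brighter with per-fragment lighting (about 0.707 versus 0.5), which is exactly the kind of difference that smooths out shading across a triangle face.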
If necessary, create an Android Resource Directory for the shaders (resource type: raw), res/raw/. Then, create the solid_color_lighting_vertex.shader and solid_color_lighting_fragment.shader files and define them as follows.
File: res/raw/solid_color_lighting_vertex.shader
uniform mat4 u_MVP;
uniform mat4 u_MV;

attribute vec4 a_Position;
attribute vec3 a_Normal;

varying vec3 v_Position;
varying vec3 v_Normal;

void main() {
    // vertex in eye space
    v_Position = vec3(u_MV * a_Position);
    // normal's orientation in eye space
    v_Normal = vec3(u_MV * vec4(a_Normal, 0.0));
    // point in normalized screen coordinates
    gl_Position = u_MVP * a_Position;
}
Note that we have separate uniform variables for u_MV and u_MVP. Also, recall that in the previous chapter we separated the lighting model from the actual model because we did not want scale to affect lighting calculations. Similarly, the projection matrix is only needed to apply the camera FOV to vertex positions, and it would interfere with lighting calculations, so we keep it out of u_MV.
File: res/raw/solid_color_lighting_fragment.shader
precision mediump float;    // default medium precision in the fragment shader

uniform vec3 u_LightPos;    // light position in eye space
uniform vec4 u_LightCol;
uniform vec4 u_Color;

varying vec3 v_Position;
varying vec3 v_Normal;

void main() {
    // distance from the light, available for attenuation (not used here)
    float distance = length(u_LightPos - v_Position);
    // direction vector from the fragment to the light
    vec3 lightVector = normalize(u_LightPos - v_Position);
    // dot product of the light vector and fragment normal.
    // If the normal and light vector are pointing in the
    // same direction then it will get max illumination.
    float diffuse = max(dot(v_Normal, lightVector), 0.01);
    // Add a tiny bit of ambient lighting (this is outer space)
    diffuse = diffuse + 0.025;
    // Multiply the color by the diffuse illumination level
    // to get the final output color
    gl_FragColor = u_Color * u_LightCol * diffuse;
}
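The fragment shader's lighting math is plain vector arithmetic, so we can mirror it on the CPU to sanity-check values. Here is a Java sketch (the class and method names are invented for illustration) that reproduces the per-fragment calculation for a single fragment:

```java
public class DiffuseSketch {
    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new float[]{v[0]/len, v[1]/len, v[2]/len};
    }

    static float dot(float[] a, float[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }

    // Same math as the fragment shader's main(), for one fragment
    static float[] shade(float[] pos, float[] normal,
                         float[] lightPos, float[] lightCol, float[] color) {
        // direction vector from the fragment to the light
        float[] lightVector = normalize(new float[]{
                lightPos[0]-pos[0], lightPos[1]-pos[1], lightPos[2]-pos[2]});
        // clamped diffuse term plus a tiny ambient contribution
        float diffuse = Math.max(dot(normal, lightVector), 0.01f) + 0.025f;
        float[] out = new float[4];
        for (int i = 0; i < 4; i++)
            out[i] = color[i] * lightCol[i] * diffuse;
        return out;
    }

    public static void main(String[] args) {
        float[] white = {1, 1, 1, 1};
        float[] yellow = {1, 1, 0.5f, 1};
        // A fragment directly facing a light straight ahead of it
        float[] lit = shade(new float[]{0, 0, -5}, new float[]{0, 0, 1},
                new float[]{0, 0, 5}, white, yellow);
        System.out.println(lit[0]);  // 1.025: full diffuse plus ambient
    }
}
```

A fragment facing the light gets the maximum diffuse value of 1.0 plus the 0.025 ambient term; a fragment facing away still gets the 0.01 floor plus ambient, so nothing renders fully black.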
Next, we define the Material class for the shaders. In the materials folder, create a new Java class named SolidColorLightingMaterial and define it as follows:
public class SolidColorLightingMaterial extends Material {
    private static final String TAG = "solidcolorlighting";
}
Add the variables for color, program references, and buffers, as shown in the following code:
float[] color = new float[4];

static int program = -1;
static int positionParam;
static int colorParam;
static int normalParam;
static int modelParam;
static int MVParam;
static int MVPParam;
static int lightPosParam;
static int lightColParam;

FloatBuffer vertexBuffer;
FloatBuffer normalBuffer;
ShortBuffer indexBuffer;
int numIndices;
Now, we can add a constructor, which receives a color (RGBA) value and sets up the shader program, as follows:
public SolidColorLightingMaterial(float[] c) {
    super();
    setColor(c);
    setupProgram();
}

public void setColor(float[] c) {
    color = c;
}
As we've seen earlier, the setupProgram method creates the shader program and obtains references to its parameters:
public static void setupProgram() {
    //Already setup?
    if (program != -1) return;

    //Create shader program
    program = createProgram(R.raw.solid_color_lighting_vertex,
            R.raw.solid_color_lighting_fragment);

    //Get vertex attribute parameters
    positionParam = GLES20.glGetAttribLocation(program, "a_Position");
    normalParam = GLES20.glGetAttribLocation(program, "a_Normal");

    //Enable them (turns out this is kind of a big deal ;)
    GLES20.glEnableVertexAttribArray(positionParam);
    GLES20.glEnableVertexAttribArray(normalParam);

    //Shader-specific parameters
    colorParam = GLES20.glGetUniformLocation(program, "u_Color");
    MVParam = GLES20.glGetUniformLocation(program, "u_MV");
    MVPParam = GLES20.glGetUniformLocation(program, "u_MVP");
    lightPosParam = GLES20.glGetUniformLocation(program, "u_LightPos");
    lightColParam = GLES20.glGetUniformLocation(program, "u_LightCol");

    RenderBox.checkGLError("Solid Color Lighting params");
}
Likewise, we add a setBuffers method that is called by the RenderObject component (Sphere):
public void setBuffers(FloatBuffer vertexBuffer, FloatBuffer normalBuffer,
                       ShortBuffer indexBuffer, int numIndices) {
    this.vertexBuffer = vertexBuffer;
    this.normalBuffer = normalBuffer;
    this.indexBuffer = indexBuffer;
    this.numIndices = numIndices;
}
Lastly, add the draw code, which will be called from the Camera component, to render the geometry prepared in the buffers (via setBuffers). The draw method looks like this:
@Override
public void draw(float[] view, float[] perspective) {
    GLES20.glUseProgram(program);

    GLES20.glUniform3fv(lightPosParam, 1,
            RenderBox.instance.mainLight.lightPosInEyeSpace, 0);
    GLES20.glUniform4fv(lightColParam, 1,
            RenderBox.instance.mainLight.color, 0);

    Matrix.multiplyMM(modelView, 0, view, 0, RenderObject.lightingModel, 0);
    // Set the ModelView in the shader,
    // used to calculate lighting
    GLES20.glUniformMatrix4fv(MVParam, 1, false, modelView, 0);

    Matrix.multiplyMM(modelView, 0, view, 0, RenderObject.model, 0);
    Matrix.multiplyMM(modelViewProjection, 0, perspective, 0, modelView, 0);
    // Set the ModelViewProjection matrix for eye position.
    GLES20.glUniformMatrix4fv(MVPParam, 1, false, modelViewProjection, 0);

    GLES20.glUniform4fv(colorParam, 1, color, 0);

    //Set vertex attributes
    GLES20.glVertexAttribPointer(positionParam, 3, GLES20.GL_FLOAT,
            false, 0, vertexBuffer);
    GLES20.glVertexAttribPointer(normalParam, 3, GLES20.GL_FLOAT,
            false, 0, normalBuffer);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES, numIndices,
            GLES20.GL_UNSIGNED_SHORT, indexBuffer);
}
Now that we have a solid color lighting material and shaders, we can add them to the Sphere class to be used in our project.
To use this Material with the Sphere, we'll define a new constructor (Sphere) that calls a helper method (createSolidColorLightingMaterial) to create the material and set the buffers. Here's the code:
public Sphere(float[] color) {
    super();
    allocateBuffers();
    createSolidColorLightingMaterial(color);
}

public Sphere createSolidColorLightingMaterial(float[] color) {
    SolidColorLightingMaterial mat = new SolidColorLightingMaterial(color);
    mat.setBuffers(vertexBuffer, normalBuffer, indexBuffer, numIndices);
    material = mat;
    return this;
}
Okay, we can now add the sphere to our scene.
Let's see how this looks! We'll create a scene with a sphere, a light, and a camera. Remember that, fortunately, the RenderBox class creates the default Camera and Light instances for us. We just need to add the Sphere component.
Edit your MainActivity.java file to add the sphere in setup. We'll color it yellowish and position it at the x, y, z location (2, -2, -5):
private Transform sphere;

@Override
public void setup() {
    sphere = new Transform();
    float[] color = new float[]{1, 1, 0.5f, 1};
    sphere.addComponent(new Sphere(color));
    sphere.setLocalPosition(2.0f, -2.0f, -5.0f);
}
Here's what it should look like, a stereoscopic pair of golden globes:
If you see what I see, you deserve an award for that!