The second visualization will also be a basic oscilloscope-style display of waveform data. Previously, we used the audio data to scale 3D slice cubes; this time, we'll render the waveform on a 2D plane using a shader that takes the audio data as input.
Our RenderBox library allows us to define new materials and shaders. In the previous projects, we built materials that use bitmap images for texture mapping onto the geometry as it's rendered. In this project, we'll paint the quad using the audio bytes array, using each byte value to control the position at which we set a brighter color. (Note that the Plane class was added to the RenderBox lib in Chapter 7, 360-Degree Gallery.)
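The core mapping can be sketched in plain Java (the helper name here is ours, not part of RenderBox): OpenGL will read each audio byte as an unsigned luminance value normalized to [0, 1], and the fragment shader will compare a fragment's y texture coordinate against that value.

```java
public class WaveformSampleSketch {
    // Hypothetical helper: a Java byte is signed (-128..127), but
    // GL_LUMINANCE / GL_UNSIGNED_BYTE treats it as unsigned (0..255)
    // and normalizes it to [0, 1] when the shader samples the texture.
    static float sampleToTexY(byte b) {
        return (b & 0xFF) / 255f;
    }

    public static void main(String[] args) {
        System.out.println(sampleToTexY((byte) 0));    // 0.0
        System.out.println(sampleToTexY((byte) -1));   // 0xFF -> 1.0
        System.out.println(sampleToTexY((byte) -128)); // 0x80 -> ~0.5
    }
}
```

This is why no explicit conversion of the signed Java bytes is needed before uploading them to the texture.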
First, let's generate a texture structure to hold our texture data. In the VisualizerBox class, add the following method to set up the texture in GLES. We can't use our normal texture pipeline, since it is designed to allocate a texture directly from image data. Our data is one-dimensional, so it may seem odd to use a Texture2D resource, but we'll set the height to one pixel:
public static int genTexture(){
    final int[] textureHandle = new int[1];
    GLES20.glGenTextures(1, textureHandle, 0);
    RenderBox.checkGLError("VisualizerBox GenTexture");
    if (textureHandle[0] != 0) {
        // Bind to the texture in OpenGL
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
        // Set filtering
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
    }
    if (textureHandle[0] == 0){
        throw new RuntimeException("Error loading texture.");
    }
    return textureHandle[0];
}
Then add the call to setup, including a static variable to hold the generated texture handle:
public static int audioTexture = -1;

public void setup() {
    audioTexture = genTexture();
    if (activeViz != null)
        activeViz.setup();
}
Now we can populate the texture from the audio byte data. In the Android Visualizer listener, add a call to loadTexture in the onWaveFormDataCapture method:
public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate){
    audioBytes = bytes;
    loadTexture(cardboardView, audioTexture, bytes);
}
Let's define loadTexture as follows. It copies the audio bytes into a new array buffer and hands it off to OpenGL ES with the glBindTexture and glTexImage2D calls. (Refer to http://stackoverflow.com/questions/14290096/how-to-create-a-opengl-texture-from-byte-array-in-android.):
public static void loadTexture(CardboardView cardboardView, final int textureId, byte[] bytes){
    if (textureId < 0)
        return;
    final ByteBuffer buffer = ByteBuffer.allocateDirect(bytes.length * 4);
    final int length = bytes.length;
    buffer.order(ByteOrder.nativeOrder());
    buffer.put(bytes);
    buffer.position(0);
    cardboardView.queueEvent(new Runnable() {
        @Override
        public void run() {
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, length, 1, 0,
                    GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, buffer);
        }
    });
}
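The buffer staging can be exercised off the GL thread with plain java.nio; here is a minimal sketch (the method name is ours). Note that GL_LUMINANCE with GL_UNSIGNED_BYTE reads one byte per texel, so allocating bytes.length would suffice; the listing's bytes.length * 4 is harmless but over-allocates:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BufferSketch {
    // Stage audio bytes into a direct, native-ordered buffer, rewound
    // to position 0 so GL reads from the start of the data.
    static ByteBuffer stage(byte[] bytes) {
        ByteBuffer buffer = ByteBuffer.allocateDirect(bytes.length);
        buffer.order(ByteOrder.nativeOrder());
        buffer.put(bytes);
        buffer.position(0); // rewind before handing off to glTexImage2D
        return buffer;
    }

    public static void main(String[] args) {
        ByteBuffer b = stage(new byte[]{10, 20, 30});
        System.out.println(b.position()); // 0
        System.out.println(b.get(1));     // 20
    }
}
```

A direct buffer is required here because OpenGL ES reads the texel data from native memory.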
Now it's time to write the shader programs that, among other things, will dictate the parameters and attributes that need to be set in the Material class.
If necessary, create a resources directory for the shaders, res/raw/. Then, create the waveform_vertex.shader and waveform_fragment.shader files and define them as follows.
The waveform_vertex.shader file is identical to the unlit_tex_vertex shader we were using. Strictly speaking, we could just reuse that file and specify its resource in the createProgram function, but it is good practice to define individual shader files unless you are explicitly following a pattern that uses a number of variants of a given shader.
File: res/raw/waveform_vertex.shader:
uniform mat4 u_MVP;
attribute vec4 a_Position;
attribute vec2 a_TexCoordinate;
varying vec2 v_TexCoordinate;

void main() {
    // pass through the texture coordinate
    v_TexCoordinate = a_TexCoordinate;
    // final point in normalized screen coordinates
    gl_Position = u_MVP * a_Position;
}
For the waveform_fragment shader, we add variables for a solid color (u_Color) and a threshold width (u_Width). Then, we add a bit of logic to decide whether the y coordinate of the current fragment is within u_Width of the sample value.
File: res/raw/waveform_fragment.shader:
precision mediump float;        // default medium precision

uniform sampler2D u_Texture;    // the input texture
varying vec2 v_TexCoordinate;   // interpolated texture coordinate per fragment
uniform vec4 u_Color;
uniform float u_Width;

// The entry point for our fragment shader.
void main() {
    // initialize to transparent black so fragments outside the band
    // get a defined color (an uninitialized vec4 is undefined in GLSL)
    vec4 color = vec4(0.0);
    float dist = abs(v_TexCoordinate.y - texture2D(u_Texture, v_TexCoordinate).r);
    if (dist < u_Width) {
        color = u_Color;
    }
    gl_FragColor = color;
}
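The shader's threshold test is simple enough to check with plain arithmetic; here is the same logic sketched in Java (the helper name is ours), using the 0.01 default width that WaveformMaterial will supply:

```java
public class WaveformThresholdSketch {
    // Mirrors the fragment shader: a fragment is lit when its y texture
    // coordinate lies within `width` of the normalized sample value.
    static boolean isLit(float texY, float sample, float width) {
        return Math.abs(texY - sample) < width;
    }

    public static void main(String[] args) {
        float sample = 0.5f; // a mid-range audio byte, normalized
        float width = 0.01f; // the threshold band half-height
        System.out.println(isLit(0.505f, sample, width)); // true: inside the band
        System.out.println(isLit(0.6f, sample, width));   // false: outside
    }
}
```

Each column of fragments therefore lights up only in a thin band around the waveform sample at that x coordinate, producing the oscilloscope trace.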
Now we define the Material class for the shaders. Create a new Java class named WaveformMaterial and define it as follows:
public class WaveformMaterial extends Material {
    private static final String TAG = "WaveformMaterial";
}
Add material variables for the shader program, its parameter handles, and the border width and color. Then, add variables for the geometry buffers, as shown in the following code:
static int program = -1; //Initialize to a totally invalid value for setup state
static int positionParam;
static int texCoordParam;
static int textureParam;
static int MVPParam;
static int colorParam;
static int widthParam;

public float borderWidth = 0.01f;
public float[] borderColor = new float[]{0.6549f, 0.8392f, 1f, 1f};

FloatBuffer vertexBuffer;
FloatBuffer texCoordBuffer;
ShortBuffer indexBuffer;
int numIndices;
Now we can add a constructor. As we saw earlier, it calls a setupProgram helper method that creates the shader program and obtains references to its parameters:
public WaveformMaterial() {
    super();
    setupProgram();
}

public static void setupProgram() {
    if (program > -1)
        return;
    //Create shader program
    program = createProgram(R.raw.waveform_vertex, R.raw.waveform_fragment);
    RenderBox.checkGLError("WaveformMaterial createProgram");
    //Get vertex attribute parameters
    positionParam = GLES20.glGetAttribLocation(program, "a_Position");
    texCoordParam = GLES20.glGetAttribLocation(program, "a_TexCoordinate");
    RenderBox.checkGLError("WaveformMaterial attrib locations");
    //Enable them (turns out this is kind of a big deal ;)
    GLES20.glEnableVertexAttribArray(positionParam);
    GLES20.glEnableVertexAttribArray(texCoordParam);
    RenderBox.checkGLError("WaveformMaterial enable attribs");
    //Shader-specific parameters
    textureParam = GLES20.glGetUniformLocation(program, "u_Texture");
    MVPParam = GLES20.glGetUniformLocation(program, "u_MVP");
    colorParam = GLES20.glGetUniformLocation(program, "u_Color");
    widthParam = GLES20.glGetUniformLocation(program, "u_Width");
    RenderBox.checkGLError("Waveform params");
}
Likewise, we add a setBuffers method to be called by the RenderObject component (Plane):
public WaveformMaterial setBuffers(FloatBuffer vertexBuffer, FloatBuffer texCoordBuffer, ShortBuffer indexBuffer, int numIndices) {
    //Associate VBO data with this instance of the material
    this.vertexBuffer = vertexBuffer;
    this.texCoordBuffer = texCoordBuffer;
    this.indexBuffer = indexBuffer;
    this.numIndices = numIndices;
    return this;
}
Add the draw code, which will be called from the Camera component, to render the geometry prepared in the buffers (via setBuffers). The draw method looks like this:
@Override
public void draw(float[] view, float[] perspective) {
    GLES20.glUseProgram(program);
    // Set the active texture unit to texture unit 0.
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    // Bind the texture to this unit.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, VisualizerBox.audioTexture);
    // Tell the texture uniform sampler to use this texture in
    // the shader by binding to texture unit 0.
    GLES20.glUniform1i(textureParam, 0);

    Matrix.multiplyMM(modelView, 0, view, 0, RenderObject.model, 0);
    Matrix.multiplyMM(modelViewProjection, 0, perspective, 0, modelView, 0);
    // Set the ModelViewProjection matrix for eye position.
    GLES20.glUniformMatrix4fv(MVPParam, 1, false, modelViewProjection, 0);

    GLES20.glUniform4fv(colorParam, 1, borderColor, 0);
    GLES20.glUniform1f(widthParam, borderWidth);

    //Set vertex attributes
    GLES20.glVertexAttribPointer(positionParam, 3, GLES20.GL_FLOAT, false, 0, vertexBuffer);
    GLES20.glVertexAttribPointer(texCoordParam, 2, GLES20.GL_FLOAT, false, 0, texCoordBuffer);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES, numIndices, GLES20.GL_UNSIGNED_SHORT, indexBuffer);
    RenderBox.checkGLError("WaveformMaterial draw");
}
One more thing: let's provide a method to destroy an existing material:
public static void destroy(){
    program = -1;
}
Now we can create a new visualization object. Under the visualizations/ folder, create a new Java class named WaveformVisualization and define it as extends Visualization:
public class WaveformVisualization extends Visualization {
    static final String TAG = "WaveformVisualization";

    public WaveformVisualization(VisualizerBox visualizerBox) {
        super(visualizerBox);
    }

    @Override
    public void setup() {
    }

    @Override
    public void preDraw() {
    }

    @Override
    public void postDraw() {
    }
}
Declare a variable for the Plane component we will create:
RenderObject plane;
Create it in the setup method as follows. Set the material to a new WaveformMaterial, and position it over toward the left:
public void setup() {
    plane = new Plane().setMaterial(new WaveformMaterial()
            .setBuffers(Plane.vertexBuffer, Plane.texCoordBuffer,
                    Plane.indexBuffer, Plane.numIndices));
    new Transform()
            .setLocalPosition(-5, 0, 0)
            .setLocalRotation(0, 90, 0)
            .addComponent(plane);
}
Now, in the onCreate method of MainActivity, replace the previous visualization with this one:
visualizerBox.activeViz = new WaveformVisualization(visualizerBox);
When you run the project, you get a visualization like this: