Now that we have that working, let's prepare our app to view regular flat photos as well. We'll do this by rendering them onto a plane, so first we need to define a Plane component.
The Plane component rightfully belongs in the RenderBox library, but for the time being, we'll add it directly to the app.
Create a new Java class file in the RenderBoxExt/components/ folder, and name it Plane. Define it so that it extends RenderObject, as follows:
public class Plane extends RenderObject {
}
As with other geometry in the RenderBox library, we'll define the plane with triangles. Just two adjacent triangles are required, for a total of six indices. The following data arrays define our default plane's 3D coordinates, UV texture coordinates, vertex colors (middle gray), normal vectors, and corresponding indices. Add the following code at the top of the class:
public static final float[] COORDS = new float[] {
        -1.0f,  1.0f, 0.0f,
         1.0f,  1.0f, 0.0f,
        -1.0f, -1.0f, 0.0f,
         1.0f, -1.0f, 0.0f
};
public static final float[] TEX_COORDS = new float[] {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f
};
public static final float[] COLORS = new float[] {
        0.5f, 0.5f, 0.5f, 1.0f,
        0.5f, 0.5f, 0.5f, 1.0f,
        0.5f, 0.5f, 0.5f, 1.0f,
        0.5f, 0.5f, 0.5f, 1.0f
};
public static final float[] NORMALS = new float[] {
        0.0f, 0.0f, -1.0f,
        0.0f, 0.0f, -1.0f,
        0.0f, 0.0f, -1.0f,
        0.0f, 0.0f, -1.0f
};
public static final short[] INDICES = new short[] {
        0, 1, 2,
        1, 3, 2
};
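To make the index layout concrete, here is a small standalone illustration (the class name PlaneIndices is ours, not part of the app) that copies the plane data from above and walks INDICES in groups of three, printing the corner positions of each of the two triangles:

```java
public class PlaneIndices {
    static final float[] COORDS = {
            -1.0f,  1.0f, 0.0f,   // vertex 0: top-left
             1.0f,  1.0f, 0.0f,   // vertex 1: top-right
            -1.0f, -1.0f, 0.0f,   // vertex 2: bottom-left
             1.0f, -1.0f, 0.0f    // vertex 3: bottom-right
    };
    static final short[] INDICES = { 0, 1, 2,  1, 3, 2 };

    public static void main(String[] args) {
        // Each group of three indices selects three vertices from COORDS
        for (int t = 0; t < INDICES.length / 3; t++) {
            System.out.print("triangle " + t + ":");
            for (int i = 0; i < 3; i++) {
                int v = INDICES[t * 3 + i];
                System.out.print("  (" + COORDS[v * 3]
                        + ", " + COORDS[v * 3 + 1] + ")");
            }
            System.out.println();
        }
    }
}
```

Notice that both triangles share the edge between vertices 1 and 2, and both are listed with the same winding order, which matters for face culling.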
Now, we can define the Plane constructor, which calls an allocateBuffers helper method that allocates buffers for the vertices, normals, texture coordinates, and indices. Let's declare variables for these at the top of the class, and write the methods:
public static FloatBuffer vertexBuffer;
public static FloatBuffer colorBuffer;
public static FloatBuffer normalBuffer;
public static FloatBuffer texCoordBuffer;
public static ShortBuffer indexBuffer;
public static final int numIndices = 6;

public Plane() {
    super();
    allocateBuffers();
}

public static void allocateBuffers() {
    //Already allocated?
    if (vertexBuffer != null) return;
    vertexBuffer = allocateFloatBuffer(COORDS);
    texCoordBuffer = allocateFloatBuffer(TEX_COORDS);
    colorBuffer = allocateFloatBuffer(COLORS);
    normalBuffer = allocateFloatBuffer(NORMALS);
    indexBuffer = allocateShortBuffer(INDICES);
}
Again, we ensure that allocateBuffers is run only once by checking whether vertexBuffer is null. (Note that we've decided to declare the buffers public, so that in the future we have the flexibility to create arbitrary texture materials for objects.)
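The allocateFloatBuffer and allocateShortBuffer helpers are inherited from the RenderObject base class. We haven't shown their implementation here, but a typical OpenGL ES buffer helper looks like the following standalone sketch (the class name BufferHelpers is ours; the actual RenderBox code may differ in detail):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

public class BufferHelpers {
    // OpenGL ES requires direct buffers in native byte order
    public static FloatBuffer allocateFloatBuffer(float[] data) {
        FloatBuffer buffer = ByteBuffer
                .allocateDirect(data.length * 4)   // 4 bytes per float
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        buffer.put(data).position(0);              // rewind for reading
        return buffer;
    }

    public static ShortBuffer allocateShortBuffer(short[] data) {
        ShortBuffer buffer = ByteBuffer
                .allocateDirect(data.length * 2)   // 2 bytes per short
                .order(ByteOrder.nativeOrder())
                .asShortBuffer();
        buffer.put(data).position(0);
        return buffer;
    }

    public static void main(String[] args) {
        FloatBuffer fb = allocateFloatBuffer(new float[]{ -1f, 1f, 0f });
        System.out.println("capacity: " + fb.capacity());
    }
}
```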
Next, we can add an appropriate material to the Plane, one that uses a texture image. Using a constructor API pattern consistent with the built-in Sphere component from Chapter 6, Solar System, we'll add the ability to construct a new Plane with an image texture ID and an optional lighting Boolean flag. Then, we'll add helper methods that allocate the corresponding Material objects and set their buffers:
public Plane(int textureId, boolean lighting) {
    super();
    allocateBuffers();
    if (lighting) {
        createDiffuseMaterial(textureId);
    } else {
        createUnlitTexMaterial(textureId);
    }
}

public Plane createDiffuseMaterial(int textureId) {
    DiffuseLightingMaterial mat = new DiffuseLightingMaterial(textureId);
    mat.setBuffers(vertexBuffer, normalBuffer, texCoordBuffer,
            indexBuffer, numIndices);
    material = mat;
    return this;
}

public Plane createUnlitTexMaterial(int textureId) {
    UnlitTexMaterial mat = new UnlitTexMaterial(textureId);
    mat.setBuffers(vertexBuffer, texCoordBuffer, indexBuffer, numIndices);
    material = mat;
    return this;
}
We can now add an image to the scene in MainActivity. Soon we will look in the phone's photos folder for pictures, but at this point, you can just use the same (photosphere) image that we used earlier (or drop another into your res/drawable folder). Note that you might have issues displaying an image that is too large for the phone's GPU. We will examine this issue later; for now, keep it under 4,096 pixels in either dimension.
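If you do need to handle oversized photos, the usual Android approach is to decode a subsampled bitmap via BitmapFactory.Options.inSampleSize, which accepts power-of-two factors. The factor itself can be computed with pure Java; here is a sketch (the class name, method name, and the 4,096-pixel limit are our illustrative assumptions):

```java
public class TextureSizer {
    // Largest texture dimension we assume the GPU supports
    static final int MAX_TEXTURE_DIM = 4096;

    // Returns the power-of-two inSampleSize that brings both
    // dimensions within MAX_TEXTURE_DIM
    static int calculateInSampleSize(int width, int height) {
        int inSampleSize = 1;
        while (width / inSampleSize > MAX_TEXTURE_DIM
                || height / inSampleSize > MAX_TEXTURE_DIM) {
            inSampleSize *= 2;
        }
        return inSampleSize;
    }

    public static void main(String[] args) {
        // An 8192x6144 photo would be halved in each dimension
        System.out.println(calculateInSampleSize(8192, 6144));  // prints 2
    }
}
```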
Name the object screen because later on, we'll use it to project whichever photo the user selects from the gallery. In MainActivity.java, update the setup function to add the image to the scene, as follows:
Plane screen;

public void setup() {
    setupBackground();
    setupScreen();
}

void setupScreen() {
    screen = new Plane(R.drawable.sample360, false);
    new Transform()
        .setLocalScale(4, 4, 1)
        .setLocalPosition(0, 0, -5)
        .setLocalRotation(0, 0, 180)
        .addComponent(screen);
}
The screen is scaled to 4 units (in X and Y), making the quad 8 units wide since the plane spans from -1 to 1, and it is placed 5 units in front of the camera. That's like sitting 5 meters (about 16 feet) from an 8-meter-wide movie screen!
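As a quick sanity check on those numbers (just an illustration, not app code): an 8-unit-wide plane viewed from 5 units away subtends a horizontal angle of 2·atan(4/5), roughly 77 degrees:

```java
public class ScreenAngle {
    public static void main(String[] args) {
        double halfWidth = 4.0;  // local scale 4 on a plane with unit half-extent
        double distance = 5.0;   // plane placed at z = -5
        // Horizontal angle subtended by the screen at the camera
        double angle = Math.toDegrees(2 * Math.atan2(halfWidth, distance));
        System.out.printf("screen subtends %.1f degrees horizontally%n", angle);
    }
}
```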
Also, note that we rotate the plane 180 degrees on the z axis; otherwise, the image will appear upside down. Our world coordinate system has its up direction along the positive y axis. However, UV space (for rendering textures) typically has its origin in the upper-left corner, with the positive direction pointing downward. (If you remember, this is also why we had to flip the Earth in the previous chapter.) Later in this chapter, when we implement an Image class, we'll read the actual orientation from the image file and set the rotation accordingly. Here's our screen plane with the image (viewed from an angle):
It will be convenient to separate the screen plane (with its image texture) from the placement and size of the screen. We will see why this is important later; it has to do with scaling and rotating the plane based on image parameters. Let's refactor the code so that the screen is parented by a screenRoot transform, as follows:
void setupScreen() {
    Transform screenRoot = new Transform()
        .setLocalScale(4, 4, 1)
        .setLocalRotation(0, 0, 180)
        .setLocalPosition(0, 0, -5);
    screen = new Plane(R.drawable.sample360, false);
    new Transform()
        .setParent(screenRoot, false)
        .addComponent(screen);
}
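To see why the parent transform helps, consider how scales compose in a transform hierarchy: with axis-aligned scaling, a child's world scale is the per-axis product of the parent's scale and its own local scale. (This is a simplified, hypothetical sketch; the actual RenderBox Transform composes full matrices.) The screenRoot can keep the fixed 4 x 4 placement while the screen child later takes on an image-dependent aspect ratio:

```java
public class ScaleCompose {
    public static void main(String[] args) {
        // Parent (screenRoot) holds the fixed placement scale
        double[] parentScale = { 4.0, 4.0, 1.0 };
        // Child (screen) would later hold an image-dependent aspect
        // ratio; for example, a hypothetical 1600x1200 photo gives 4:3
        double[] childScale = { 1.0, 1200.0 / 1600.0, 1.0 };
        double[] worldScale = new double[3];
        for (int i = 0; i < 3; i++) {
            worldScale[i] = parentScale[i] * childScale[i];
        }
        // The image keeps its aspect while the screen stays in place
        System.out.printf("world scale: %.2f x %.2f x %.2f%n",
                worldScale[0], worldScale[1], worldScale[2]);
    }
}
```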