We now know all the basics needed to create a virtual/augmented reality application. Displaying blocks in different colors will, however, not stay interesting for long. In this lesson we therefore learn how to implement custom Renderables. The Renderable we produce in this lesson is a simple surface, but we will see how such a surface can be implemented from the bottom up, and at the same time we will add a texture to it. This also lets us see how the VRInitCallback from the previous lesson is used to create the texture.
For the complete file produced in this lesson, click here.
For the complete application produced in this lesson, click here.
Implementing a custom Renderable
The only thing a class needs to implement to be a Renderable is the method
public void render(GL10 gl);
We therefore create a new class called TexturedSurface and insert the following content into it.
package com.dafer45.virtualreality;

import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.opengles.GL10;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLUtils;

import com.dafer45.virtualreality.renderable.Renderable;
import com.dafer45.utilities.MathVector;

public class TexturedSurface implements Renderable{
    MathVector position;

    private FloatBuffer vertexBuffer;
    private FloatBuffer textureBuffer;
    private ByteBuffer indexBuffer;

    private static int[] textures = new int[1];

    //The four corners of a 10x10 surface in the y = 0 plane.
    private float vertices[] = {-5f, 0.0f, -5f,
                                5f, 0.0f, -5f,
                                -5f, 0.0f, 5f,
                                5f, 0.0f, 5f};

    //Texture coordinates for the four corners.
    private float texture[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f
    };

    //The surface is drawn as two triangles.
    private byte indices[] = {
        0, 1, 3, 0, 3, 2
    };

    public TexturedSurface(){
        position = new MathVector(0, 0, 0);

        //Copy the arrays into direct, native-ordered buffers that
        //OpenGL can read from.
        ByteBuffer byteBuf =
            ByteBuffer.allocateDirect(vertices.length*4);
        byteBuf.order(ByteOrder.nativeOrder());
        vertexBuffer = byteBuf.asFloatBuffer();
        vertexBuffer.put(vertices);
        vertexBuffer.position(0);

        byteBuf = ByteBuffer.allocateDirect(texture.length*4);
        byteBuf.order(ByteOrder.nativeOrder());
        textureBuffer = byteBuf.asFloatBuffer();
        textureBuffer.put(texture);
        textureBuffer.position(0);

        indexBuffer = ByteBuffer.allocateDirect(indices.length);
        indexBuffer.put(indices);
        indexBuffer.position(0);
    }

    public static void loadTexture(GL10 gl, Context context){
        //Read the texture bitmap from the resources.
        InputStream is =
            context.getResources().openRawResource(R.drawable.texture);
        Bitmap bitmap = null;
        try{
            bitmap = BitmapFactory.decodeStream(is);
        }
        finally{
            try{
                is.close();
                is = null;
            }
            catch(IOException e){}
        }

        //Generate a texture name, set its filtering and wrapping
        //parameters, and upload the bitmap to it.
        gl.glGenTextures(1, textures, 0);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D,
                           GL10.GL_TEXTURE_MIN_FILTER,
                           GL10.GL_NEAREST);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D,
                           GL10.GL_TEXTURE_MAG_FILTER,
                           GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D,
                           GL10.GL_TEXTURE_WRAP_S,
                           GL10.GL_REPEAT);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D,
                           GL10.GL_TEXTURE_WRAP_T,
                           GL10.GL_REPEAT);
        GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);

        bitmap.recycle();
    }

    public void render(GL10 gl){
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
        gl.glFrontFace(GL10.GL_CCW);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);

        //Translate to the position of the surface, draw it, and
        //translate back so the modelview matrix is left unchanged.
        gl.glTranslatef(position.x, position.y, position.z);
        gl.glDrawElements(GL10.GL_TRIANGLES, indices.length,
                          GL10.GL_UNSIGNED_BYTE, indexBuffer);
        gl.glTranslatef(-position.x, -position.y, -position.z);

        gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }

    public void setPosition(MathVector position){
        this.position = position;
    }
}
It is assumed that you are familiar with OpenGL, so the code is only described briefly here. It builds largely on lesson 6 of the NeHe Android Ports at INsanityDesign, which you should consult if any of the OpenGL-related calls are unclear.
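One detail worth spelling out is the buffer setup in the constructor. OpenGL ES reads vertex data from direct NIO buffers in native byte order, so a plain Java float array must first be copied into such a buffer. The pattern can be tried out in plain Java, without Android; the class and helper names below are our own, for illustration only:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class GLBufferExample {
    //Copies a float array into a direct, native-ordered FloatBuffer,
    //rewound to position 0 so OpenGL reads it from the start.
    public static FloatBuffer toDirectFloatBuffer(float[] data) {
        ByteBuffer byteBuf = ByteBuffer.allocateDirect(data.length * 4);
        byteBuf.order(ByteOrder.nativeOrder());
        FloatBuffer floatBuf = byteBuf.asFloatBuffer();
        floatBuf.put(data);
        floatBuf.position(0);
        return floatBuf;
    }

    public static void main(String[] args) {
        FloatBuffer buf = toDirectFloatBuffer(new float[]{-5f, 0f, -5f});
        System.out.println(buf.isDirect());  //true
        System.out.println(buf.get(0));      //-5.0
    }
}
```

This is exactly what the constructor of TexturedSurface does three times, for the vertex, texture-coordinate, and index data (the index buffer stays a ByteBuffer since its elements are bytes).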
A couple of things related to the virtual reality package do, however, need explanation. First, we implement the Renderable interface by defining the render-method. Note that the object translates and renders itself: when implementing the render-method, you are responsible for positioning and orienting the object. It is important that all positioning and orientation is done relative to the coordinate system of the scene, not relative to the camera; the camera position and orientation are, as we have seen, handled by the location and orientation handlers. Applying textures and rendering the object are likewise your responsibility in the render-method.
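To see what glDrawElements actually draws, the index array can be expanded by hand: each group of three indices selects three (x, y, z) triples out of the vertex array, forming one triangle. A small standalone check, in plain Java with no OpenGL involved (the class name is ours):

```java
public class IndexExpansion {
    //Returns the nine corner coordinates of triangle t, given a packed
    //(x, y, z) vertex array and a triangle index list.
    public static float[] triangle(float[] vertices, byte[] indices, int t) {
        float[] corners = new float[9];
        for (int i = 0; i < 3; i++) {
            int v = indices[3 * t + i];
            corners[3 * i]     = vertices[3 * v];
            corners[3 * i + 1] = vertices[3 * v + 1];
            corners[3 * i + 2] = vertices[3 * v + 2];
        }
        return corners;
    }

    public static void main(String[] args) {
        //Same data as in TexturedSurface.
        float[] vertices = {-5f, 0f, -5f,
                            5f, 0f, -5f,
                            -5f, 0f, 5f,
                            5f, 0f, 5f};
        byte[] indices = {0, 1, 3, 0, 3, 2};

        //The quad is split into the triangles (0, 1, 3) and (0, 3, 2).
        for (int t = 0; t < 2; t++)
            System.out.println(java.util.Arrays.toString(
                    triangle(vertices, indices, t)));
    }
}
```

Both triangles are listed counterclockwise when viewed from above, which matches the glFrontFace(GL10.GL_CCW) call in render.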
The second thing related to our use of the virtual reality package is the loadTexture-method. A Renderable is not required to have this method, but in order to load the texture when the application initializes, we will call it from the VRInitCallback that we learned to use in the previous lesson.
Finally, a third note: in the setPosition-method we have chosen to use the MathVector class that was introduced in the second lesson. This is how positions are handled in the rest of the package, so it is better to get used to it even when implementing our own extensions.
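The code above only relies on MathVector having a three-argument constructor and public x, y, and z fields. For readers without the package at hand, a minimal stand-in with just that shape looks as follows (the real class in com.dafer45.utilities very likely offers more functionality):

```java
//Minimal stand-in for com.dafer45.utilities.MathVector, for illustration
//only; use the real class when working with the package.
public class MathVector {
    public float x;
    public float y;
    public float z;

    public MathVector(float x, float y, float z) {
        this.x = x;
        this.y = y;
        this.z = z;
    }
}
```

With this shape, setPosition(new MathVector(0, 20, 0)) stores a displacement of 20 units along the y-axis, which is the position used at the end of this lesson.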
Loading the texture
Now we return to the VRFirst class and its VRInitCallback. To load the texture and enable it to be displayed, make sure the content of the init-method is the following. (Note that we also change the color back to white.)
gl.glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
TexturedSurface.loadTexture(gl, VRFirst.this);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
Finally, in the onCreate-method, we create our object and add it to the scene by adding the following lines.
TexturedSurface texturedSurface = new TexturedSurface();
texturedSurface.setPosition(new MathVector(0, 20, 0));
basicScene.addRenderable(texturedSurface);
The actual texture
To be able to compile the project, we now only have to copy
this file to our project's res/drawable folder. (Note that OpenGL ES 1.x generally requires texture dimensions that are powers of two, for example 128x128 or 256x256.) When the application is run, a large textured surface should appear 20 meters to the north.