Basic Android OpenGL ES
To better grasp how AndEngine's rendering process works, you need to be familiar with how drawing meshes with OpenGL works. To this end, we'll run through a basic OpenGL ES 2.0 application on Android. Those who already have OpenGL ES 2.0 experience can likely skip this section. Others should read this to either learn about the basics of OpenGL or learn how Android's OpenGL API works.
For those unfamiliar with the differences between OpenGL and OpenGL ES, an easy way to understand it is that OpenGL ES (embedded systems) is a stripped-down version of OpenGL suitable for the more limited graphics processors on embedded systems, including gaming consoles and smartphones. While this doesn't really affect the basic rendering functionality, for advanced features, it is somewhat more limiting than the full OpenGL API. However, it will have no significant effect in the case of general game development on embedded systems, including our own efforts in this book.
Getting started, we need a few basic components in Android in order to render a mesh:
- Mesh data
- A GLSurfaceView
- A GLSurfaceView.Renderer implementation
What the mesh data is should be fairly obvious by now. In the following example, we'll use the data that we'll be importing using the Assimp-based importer we wrote in the previous section. GLSurfaceView is the UI widget on which we'll be drawing our model. GLSurfaceView.Renderer is a Java interface declaring the methods that we have to implement: onSurfaceCreated, onSurfaceChanged, and onDrawFrame.
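To make the shape of this interface concrete before diving into the details, here is a minimal skeleton of a renderer implementation. The class name SkeletonRenderer is ours, and the empty method bodies are filled in by the actual Renderer class later in this section:

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLSurfaceView;

public class SkeletonRenderer implements GLSurfaceView.Renderer {
    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        // Called when the rendering surface (and its OpenGL context) is created.
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        // Called when the surface dimensions change, for example on rotation.
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        // Called for every frame that has to be drawn.
    }
}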
Now we create the following files in our basic OpenGL Android project:
MainActivity.java
Renderer.java
ModelData.java
Scene.java
Camera.java
Actor.java
Mesh.java
ShaderProgram.java
MainActivity
private GLSurfaceView mGLView;
private Renderer renderer;
private ModelData mModelData;
private Scene mScene;
private Camera mCamera;
private Actor mActor;
private ShaderProgram mShaderProgram;
private Mesh mMesh;
private AssetManager mAssetManager;
Most of these variables should be fairly obvious by now. GLSurfaceView is our rendering surface, as explained earlier. Renderer is our renderer implementation. Actor encapsulates the model's Mesh and ShaderProgram classes. The Scene object holds our camera and models, and the ModelData and AssetManager instances are used with the Asset importer. The code for using it is as follows:
static {
    System.loadLibrary("assimpImporter");
}

private native boolean getModelData(ModelData model, AssetManager manager, String filename);
The static block tells the system to load our native library. Next, we declare the exported function from the library as a native method, listing only our own parameters; the JNI-specific parameters are added automatically.
To load a model, we use the following code:
public void loadModel() {
    mAssetManager = getAssets();
    mModelData = new ModelData();
    if (!getModelData(mModelData, mAssetManager, "models/teapot.obj")) {
        mModelData = null;
    }
}
Here, we obtain a reference to the AssetManager and create a new instance of our ModelData class. Both are passed to the native function we declared earlier, together with the name of the file containing the model we wish to load. If everything goes well, we can set up the rendering context. We do this in the onCreate function of our MainActivity:
loadModel();

mGLView = new GLSurfaceView(this);

final ActivityManager activityManager = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
final ConfigurationInfo configurationInfo = activityManager.getDeviceConfigurationInfo();
final boolean supportsEs2 = configurationInfo.reqGlEsVersion >= 0x20000;

if (supportsEs2 || Build.FINGERPRINT.startsWith("generic")) {
    // Request an OpenGL ES 2.0 compatible context.
    mGLView.setEGLContextClientVersion(2);
    renderer = new Renderer();
    mGLView.setRenderer(renderer);
} else {
    // Create an OpenGL ES 1.x compatible renderer
    return;
}

setContentView(mGLView);
This is a fair bit of code to take in. Essentially, we first call the loadModel function we defined earlier. After that, we create a new GLSurfaceView instance. Before setting this as the rendering surface, we want to make sure that the hardware we're running on actually supports OpenGL ES 2.0, as this is the version we're targeting.

For this, we require a ConfigurationInfo reference containing system information, which we obtain via ActivityManager. This is somewhat convoluted, but it works. We can then check the reqGlEsVersion property against the desired OpenGL version. We use a hexadecimal value for this because that's how the property is defined; 0x20000 indicates OpenGL ES 2.0, and we accept this or any higher version.

Depending on the result of this test, we either create the OpenGL ES 2.0 context on the GLSurfaceView, or fall back to a compatibility mode using an OpenGL ES 1.x renderer. In this example, we simply abort execution in that case.

The second test in the if statement compares against the build fingerprint of the Android system we're running on. This is a string containing some generic information that tells us what type of Android system it is, who built it, and the like. We test whether it starts with the string generic, as the Android images used with the SDK emulator do. The reason we have to detect the emulator at all is a flaw in the SDK emulator, which doesn't fill in the ConfigurationInfo properties, leaving the reqGlEsVersion property at zero. To enable our code to run on the emulator, we need to make this exception.
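As a side note, the reqGlEsVersion value packs the major version into the upper 16 bits and the minor version into the lower 16 bits, which is why 0x20000 corresponds to version 2.0. A small sketch of how the value could be decoded for logging purposes; the reqVersion, major, and minor names are ours, and Log is android.util.Log:

int reqVersion = configurationInfo.reqGlEsVersion;
int major = reqVersion >> 16;     // upper 16 bits hold the major version
int minor = reqVersion & 0xFFFF;  // lower 16 bits hold the minor version
Log.d("MainActivity", "Supported OpenGL ES version: " + major + "." + minor);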
As an aside, we'll likely want to specify the requirement for OpenGL ES 2.0 in our application's AndroidManifest.xml file as well, using the following element:
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
Moving on, we also create an instance of our custom Renderer class and assign it to the GLSurfaceView as the renderer to be used. Finally, we set the GLSurfaceView instance as the content view of our application. This means that it'll now control what is visible on the screen.
The way we have structured the code here is such that the renderer just gets a Scene object instance. This means that we first have to initialize and set the model data we wish to have rendered on the Scene we create, before we assign it to the Renderer instance.
Central to any model in this system is the Actor class, which uses a Mesh class and a ShaderProgram class instance for the mesh and shader functionality, respectively. For each Actor instance we wish to construct, we start by creating the Mesh instance:
mMesh = new Mesh();
mMesh.setVertices(mModelData.getVertexArray(), mModelData.getNormalArray(), mModelData.getUvArray());
mMesh.setIndices(mModelData.getIndexArray());
The Mesh object is loaded with the data we retrieved earlier via our model importer, namely the vertex, normal, UV, and index arrays. Next is the ShaderProgram instance:
String vertexShaderCode;
String fragmentShaderCode;
try {
    vertexShaderCode = readFile(mAssetManager.open("shaders/vertex.glv"));
    fragmentShaderCode = readFile(mAssetManager.open("shaders/fragment.glf"));
} catch (IOException e) {
    e.printStackTrace();
    return;
}

if (vertexShaderCode == null || fragmentShaderCode == null) {
    return;
}
For modularity and ease of maintenance, we have not hardcoded the code for the vertex and fragment shaders. Instead, they are files in the assets folder of our project. This means that we have to use the AssetManager instance we created earlier to get access to them. The resulting standard Java InputStream is not directly usable as a string, so we first need to convert it into its String representation. For this, we use an additional function, readFile:
private String readFile(InputStream stream) {
    BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
    StringBuilder result = new StringBuilder();
    String line = "";
    try {
        line = reader.readLine();
        while (line != null) {
            result.append(line);
            result.append("\n");
            line = reader.readLine();
        }
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
    return result.toString();
}
This step may seem somewhat convoluted to those who haven't dealt with streams and files in Java. Essentially, we start off with a standard Java InputStream, which is just a handle to some kind of stream. We then wrap it in an InputStreamReader, which provides actual functions to do something with this stream. Finally, we use a BufferedReader to give us the ability to read one line at a time using its readLine function.
StringBuilder is primarily there to optimize the process of concatenating the individual strings. As the readLine() function strips the newline character at the end of each line, we have to add it back. While the shader may compile with everything on a single line, it would make debugging a failing shader unnecessarily hard when the only indication of the fault's location is a line number.
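On that note, the shader compilation code shown later in the ShaderProgram class does not check whether compilation actually succeeded. To see those line numbers during development, a check along these lines could be added after compiling a shader; this is only a sketch, where shaderId stands for the identifier returned by glCreateShader and Log is android.util.Log:

int[] compileStatus = new int[1];
GLES20.glGetShaderiv(shaderId, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
if (compileStatus[0] == 0) {
    // The info log contains the compiler errors, including line numbers.
    Log.e("ShaderProgram", GLES20.glGetShaderInfoLog(shaderId));
}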
Finishing the ShaderProgram setup, we instantiate it and load our shader code into it:
mShaderProgram = new ShaderProgram();
mShaderProgram.setVertexShaderCode(vertexShaderCode);
mShaderProgram.setFragmentShaderCode(fragmentShaderCode);
With all of the data loaded, we can now set up the details of the scene, which largely involves setting up the positions and orientation of the models and the camera. For this, we use these variables:
float[] position = new float[3];
float[] at = new float[3];
float[] up = new float[3];
float[] side = new float[3];
The function of position should be obvious. The at, up, and side variables are vectors projecting from the object that the position array is used with, indicating its orientation along the axes of the coordinate system it is placed in. We can now begin placing our Actor in the scene:
mActor = new Actor();
mActor.setShaderProgram(mShaderProgram);
mActor.setMesh(mMesh);

position[0] = 0.0f;
position[1] = 0.0f;
position[2] = 0.0f;
mActor.setPosition(position);

at[0] = 0.0f;
at[1] = 0.0f;
at[2] = 1.0f;
up[0] = 0.0f;
up[1] = 1.0f;
up[2] = 0.0f;
side[0] = 1.0f;
side[1] = 0.0f;
side[2] = 0.0f;
mActor.setOrientation(at, up, side);
We place our Actor at the center of the scene, at the coordinates (0, 0, 0). It is oriented with its top pointing upwards, facing forward along the positive z axis.
Next, we set up the camera:
mCamera = new Camera();

position[0] = 0.0f;
position[1] = 0.0f;
position[2] = -50.0f;
mCamera.setPosition(position);

at[0] = 0.0f;
at[1] = 0.0f;
at[2] = 1.0f;
up[0] = 0.0f;
up[1] = 1.0f;
up[2] = 0.0f;
side[0] = 1.0f;
side[1] = 0.0f;
side[2] = 0.0f;
mCamera.setOrientation(at, up, side);
We place the camera 50 units back along the negative z axis, facing the Actor object we just placed in the scene.
Finally, we add both the Actor and the Camera to the Scene instance:
mScene = new Scene();
mScene.addActor(mActor);
mScene.setCamera(mCamera);
At this point, it may seem easy to simply hand the Scene object to the Renderer instance, but directly calling a function on the renderer to transfer the data would result in an exception. The reason for this is that the renderer implicitly runs on a separate thread, commonly called the OpenGL thread. This means that we have to use one of Java's cross-thread communication mechanisms for this operation:
mGLView.queueEvent(new Runnable() {
    public void run() {
        renderer.setScene(mScene);
    }
});
Here, we use a Runnable to accomplish this task. We call the queueEvent method on the GLSurfaceView to schedule our Runnable, which in turn uses our reference to the renderer to call its setScene function, assigning it our data.
We're nearly done with MainActivity at this point. One final thing we need to take care of is pausing and resuming. Whenever an application is paused, the OpenGL context that it was using will be destroyed. Upon resuming, it will have to be recreated. That's why we need to implement the onPause and onResume functions:
@Override
protected void onResume() {
    super.onResume();
    mGLView.onResume();
}

@Override
protected void onPause() {
    super.onPause();
    mGLView.onPause();
}
These will call the similarly named functions in the GLSurfaceView instance, allowing it to manage the OpenGL context.
ModelData
The ModelData class is very simple, but plays a crucial role in transferring the model data from the importer library to the rest of the application, where it is used to set up the Mesh for an Actor instance. It has a number of variables and their respective getters and setters. These are the variables, which should look familiar:
private float[] vertexArray;
private float[] normalArray;
private float[] uvArray;
private short[] indexArray;
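The accessors themselves are not listed in full. As a sketch, the pair for the vertex array would look something like the following, with the other three arrays following the same pattern; the getter name matches the calls we made earlier in MainActivity, while the setter name is simply our assumption of the usual Java convention:

public float[] getVertexArray() {
    return vertexArray;
}

public void setVertexArray(float[] vertexArray) {
    this.vertexArray = vertexArray;
}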
This class contains one final function, which can be very useful: the vertex, normal, and UV data is often transferred to the graphics hardware as a single, interleaved array. For this, we have a simple utility function:
public float[] getInterleavedArray() {
    int numVertices = vertexArray.length / 3;
    float[] interleavedArray = new float[8 * numVertices];
    for (int i = 0; i < numVertices; i++) {
        interleavedArray[8 * i + 0] = vertexArray[3 * i + 0];
        interleavedArray[8 * i + 1] = vertexArray[3 * i + 1];
        interleavedArray[8 * i + 2] = vertexArray[3 * i + 2];
        interleavedArray[8 * i + 3] = normalArray[3 * i + 0];
        interleavedArray[8 * i + 4] = normalArray[3 * i + 1];
        interleavedArray[8 * i + 5] = normalArray[3 * i + 2];
        interleavedArray[8 * i + 6] = uvArray[2 * i + 0];
        interleavedArray[8 * i + 7] = uvArray[2 * i + 1];
    }
    return interleavedArray;
}
For each vertex, we copy its position, normal, and UV values into the new array and return the result.
Renderer
As mentioned earlier, this class implements the GLSurfaceView.Renderer interface:
public class Renderer implements GLSurfaceView.Renderer {
    private Scene scene;
    private float[] viewToProjectionMatrix;
The scene variable holds the Scene instance we will be drawing. It is assigned from the MainActivity through the setScene function shown later and remains null until then, indicating that there is nothing to draw yet.

Further, the usual assortment of matrices is spread across our classes: the actor-to-world matrix in the Actor class places a model in world space, the world-to-view matrix in the Camera class encodes the camera's position and orientation, and the view-to-projection matrix kept here in the Renderer performs the projection transformation.
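The listing above omits the Renderer's constructor; a minimal sketch, assuming it does nothing more than initialize these two members, could look like this:

public Renderer() {
    scene = null;
    // A 4 x 4 matrix stored as 16 floats, filled in later by onSurfaceChanged.
    viewToProjectionMatrix = new float[16];
}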
Our onSurfaceCreated function looks like this:
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
    GLES20.glClearColor(0.9f, 0.9f, 0.9f, 1.0f);
    GLES20.glClearDepthf(1.0f);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    GLES20.glDepthFunc(GLES20.GL_LEQUAL);
    GLES20.glEnable(GLES20.GL_DEPTH_TEST);

    if (scene != null) {
        scene.onResume();
    }
}
Here, we set the background color and enable the depth buffer and depth test. We also call the Scene instance's onResume() function if we have a valid Scene instance, so that it can perform any reinitialization it needs. An example of this is when the application resumes, at which point the Scene may have to let the Actor instances it manages recreate their hardware buffers. We use none of the parameters provided to the function; the GL10 parameter is there only for compatibility with the older OpenGL ES 1.x API.
Next is the onSurfaceChanged function, which is called every time the surface changes, such as when the device is rotated, causing the dimensions to change:
public void onSurfaceChanged(GL10 unused, int width, int height) {
    GLES20.glViewport(0, 0, width, height);
    float ratio = (float) width / height;
    Matrix.frustumM(viewToProjectionMatrix, 0, -ratio, ratio, -1, 1, 1, 1000);
}
In this function, we set the viewport, as it may have changed along with the surface. We also create the view-to-projection matrix here, using the frustumM function. Unlike the orthographic alternative, it creates a proper perspective projection, in which objects that are further away appear smaller.
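For comparison, an orthographic projection with the same bounds would be set up as follows. This is only a sketch to illustrate the difference and is not used in this example:

// No perspective: objects keep the same size regardless of their distance.
Matrix.orthoM(viewToProjectionMatrix, 0, -ratio, ratio, -1, 1, 1, 1000);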
These were all the OpenGL setup functions. We just have the drawing function left:
public void onDrawFrame(GL10 unused) {
    if (scene == null) {
        return;
    }
    scene.draw(viewToProjectionMatrix);
}
We start by checking whether the Scene instance has already been assigned, and return from the function if we have no Scene object to work with. Otherwise, we call the draw function of the Scene object and provide it with the view-to-projection matrix we created earlier.
The catch with an OpenGL context on Android is that you'll encounter a lot of calls to onSurfaceCreated, whether as part of the application launch process or due to application state changes. Therefore, when you get a call to onSurfaceCreated, you cannot assume that it's the first time this function is being called. Instead, you have to verify the current state, which is the reason for the null checks in this implementation.

In this particular situation, we also cannot assume that our Scene instance has already been assigned, due to the multithreaded nature of the application, which is why both onSurfaceCreated and onDrawFrame check for it before doing anything. Note also that when the OpenGL context is destroyed and later recreated, the imported model data remains valid, but the resources created on the graphics hardware (buffers and compiled shaders) do not; this is what the onResume chain through the Scene and Actor classes takes care of.
The setScene function, which is referred to in the Runnable implementation in the MainActivity, looks like this:
public void setScene(Scene scene) { this.scene = scene; }
There is nothing too exceptional here. We assign the Scene instance to its respective variable. This is all that the Renderer needs in order to draw the scene we just created; all of the drawing logic is contained within Actor and the related classes.
Before moving on, a few words on shader compilation, which we will encounter in the ShaderProgram class. Shaders are an integral part of OpenGL ES 2.0 and beyond, unlike OpenGL ES 1.x, which uses a fixed-function pipeline. A shader allows us to alter the processing of both vertex and fragment information, which gives us two different types of shaders, indicated by a type parameter when creating one. Creating a shader returns a shader identifier in the form of an integer. We use this identifier to attach the provided shader code to the shader, which we then compile. If everything goes well, we have a functioning shader.
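Wrapped into a small helper, these steps could look like the following sketch. The ShaderProgram class shown later performs the same calls inline in its bind function, so this loadShader helper is purely illustrative and not one of the project's classes:

private int loadShader(int type, String shaderCode) {
    // type is either GLES20.GL_VERTEX_SHADER or GLES20.GL_FRAGMENT_SHADER.
    int shaderId = GLES20.glCreateShader(type);
    GLES20.glShaderSource(shaderId, shaderCode);
    GLES20.glCompileShader(shaderId);
    return shaderId;
}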
Scene
The Scene class is another very basic class, but it helps with the partitioning of an application into logical elements. Essentially, the renderer is only aware of a scene, with the latter managing the models:
public class Scene {
    private ArrayList<Actor> actors;
    private Camera camera;

    public Scene() {
        actors = new ArrayList<Actor>();
        camera = null;
    }

    public void addActor(Actor actor) {
        actors.add(actor);
    }

    public void removeActor(Actor actor) {
        actors.remove(actor);
    }

    public Actor getActor(int i) {
        return actors.get(i);
    }

    public int getNumActors() {
        return actors.size();
    }

    public void setCamera(Camera camera) {
        this.camera = camera;
    }

    public Camera getCamera() {
        return camera;
    }

    public void draw(float[] viewToProjectionMatrix) {
        if (camera == null) {
            return;
        }
        for (int i = 0; i < actors.size(); i++) {
            actors.get(i).draw(camera, viewToProjectionMatrix);
        }
    }

    public void onResume() {
        for (int i = 0; i < actors.size(); i++) {
            actors.get(i).onResume();
        }
    }
}
We provide a list for the Actor instances as well as a Camera variable, together with their getters and setters. Some further utility functions are also provided, though they're not used later in this application; this shows how you can extend these classes to provide further functionality.
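As one example of such an extension, a method to empty the scene again could be added along these lines. It is not part of the listing above and only uses the existing fields:

public void clear() {
    // Remove all actors and detach the camera, leaving an empty scene.
    actors.clear();
    camera = null;
}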
This Scene implementation can contain an arbitrary number of Actor objects, added through the addActor function. The other key function is draw, which accepts a single parameter in the form of a view-to-projection matrix. This matrix is then passed to the draw function of each Actor object contained in the list.
Finally, the onResume function is called by the Renderer when OpenGL-related items may need to be reset. Here, the onResume function of each Actor instance is called in turn, giving Actor and its related classes the same opportunity.
We do not implement an onPause method here, because the Actor class does not implement one either. However, if it did, we would call it from here in the same way as we call the onResume method.
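For illustration, such a forwarding method would look almost identical to onResume; both this Scene.onPause and the Actor.onPause it calls are hypothetical and not part of the actual code:

public void onPause() {
    for (int i = 0; i < actors.size(); i++) {
        // Assumes Actor exposes a matching onPause method.
        actors.get(i).onPause();
    }
}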
Camera
In OpenGL, a camera is a rather abstract concept. Basically, it's nothing more than a set of matrices used to transform model coordinates into a common coordinate system, so that the models can be rendered onto a 2D surface in a consistent way.
We can see the relevant matrices as we look at the start of the Camera class:
public class Camera {
    private float[] worldToViewMatrix;
    private float[] viewToWorldMatrix;
The worldToViewMatrix matrix is used to transform world coordinates to view coordinates, while the viewToWorldMatrix matrix is its inverse. What we are most interested in here is the first matrix, world to view, as we use it later to transform the models in the scene into the view coordinate system.
Next is the constructor:
public Camera() {
    worldToViewMatrix = new float[16];
    viewToWorldMatrix = new float[16];
    Matrix.setIdentityM(worldToViewMatrix, 0);
    Matrix.setIdentityM(viewToWorldMatrix, 0);
}
Our matrices are 16 floats in size, in a 4 x 4 configuration. We then initialize both of them to the identity matrix.
We can now set the values for these matrices through a series of setters:
public void setPosition(float[] position) {
    worldToViewMatrix[12] = position[0];
    worldToViewMatrix[13] = position[1];
    worldToViewMatrix[14] = position[2];
    Matrix.invertM(viewToWorldMatrix, 0, worldToViewMatrix, 0);
}

public void setOrientation(float[] at, float[] up, float[] side) {
    worldToViewMatrix[0] = side[0];
    worldToViewMatrix[1] = side[1];
    worldToViewMatrix[2] = side[2];
    worldToViewMatrix[4] = up[0];
    worldToViewMatrix[5] = up[1];
    worldToViewMatrix[6] = up[2];
    worldToViewMatrix[8] = at[0];
    worldToViewMatrix[9] = at[1];
    worldToViewMatrix[10] = at[2];
    Matrix.invertM(viewToWorldMatrix, 0, worldToViewMatrix, 0);
}
These functions allow us to set individual elements of the world-to-view matrix. After setting these new values, we use the invertM function from Android's Matrix API to invert the matrix and obtain the view-to-world matrix.
Finally, we add a getter function for the world-to-view matrix, as we'll need it later on:
public float[] getWorldToViewMatrix() { return worldToViewMatrix; }
We can now move on to the classes that are directly related to the drawing of the model.
ShaderProgram
The code for the vertex and fragment shaders is, as mentioned earlier, contained in the asset files. Since the code in the ShaderProgram class directly links with variables in these shaders, we'll list their code here. First is the vertex shader:
attribute vec3 positionAttrib;
attribute vec3 normalAttrib;
attribute vec2 uvAttrib;

varying vec3 positionVarying;
varying vec3 normalVarying;
varying vec2 uvVarying;

uniform mat4 actorToViewMatrix;
uniform mat4 viewToProjectionMatrix;

void main() {
    positionVarying = vec3(actorToViewMatrix * vec4(positionAttrib, 1.0));
    gl_Position = viewToProjectionMatrix * vec4(positionVarying, 1.0);
    normalVarying = vec3(actorToViewMatrix * vec4(normalAttrib, 0.0));
    uvVarying = uvAttrib;
}
This is a fairly basic vertex shader. It declares attributes for the position, normal, and UV data, together with uniform variables for the actor-to-view matrix and the view-to-projection matrix. These are used to transform the raw mesh coordinates first into the view (camera) coordinate system and from there into projection space.
The fragment shader is even more basic:
precision mediump float;

varying vec3 positionVarying;
varying vec3 normalVarying;
varying vec2 uvVarying;

uniform vec4 vColor;

void main() {
    gl_FragColor = vColor;
}
The only thing this shader does is assign a single color to every fragment, thus giving the entire model a single color.
Let's move on to the header of the ShaderProgram class:
public class ShaderProgram {
    private String vertexShaderCode;
    private String fragmentShaderCode;

    private int glVertexShaderId;
    private int glFragmentShaderId;
    private int glShaderProgramId;

    private int glPositionAttribId;
    private int glNormalAttribId;
    private int glUvAttribId;
    private int glActorToViewMatrixId;
    private int glViewToProjectionMatrixId;

    private boolean glSynced;
Much of this should be familiar by now. We have the shader code strings, the IDs for the compiled code, and the IDs for the attributes and uniform matrices in the vertex shader. The last variable, glSynced, is a simple boolean flag indicating whether the data on the graphics hardware is still the same as the local data, or whether a refresh is required:
public ShaderProgram() {
    vertexShaderCode = null;
    fragmentShaderCode = null;
    glVertexShaderId = 0;
    glFragmentShaderId = 0;
    glShaderProgramId = 0;
    glSynced = false;
}

public void setVertexShaderCode(String code) {
    vertexShaderCode = code;
    glSynced = false;
}

public void setFragmentShaderCode(String code) {
    fragmentShaderCode = code;
    glSynced = false;
}
After initializing the variables in the constructor, the vertex and fragment shader code can be set via setter functions. Both also set glSynced to false so that the next time the shaders are required for drawing, the new code will be compiled and used.
For the case when the OpenGL context has to be restored, we may have to force this behavior; this is tied to the onResume function mentioned earlier:
public void setDirtyOnHardware() { glSynced = false; }
Next is the meat of this class, in the form of the bind function. It is called every time before a draw operation occurs and ensures that the right shaders are loaded:
public void bind() {
    if (!glSynced) {
        if (vertexShaderCode == null || fragmentShaderCode == null) {
            return;
        }

        if (glVertexShaderId != 0) {
            GLES20.glDeleteShader(glVertexShaderId);
            glVertexShaderId = 0;
        }
        if (glFragmentShaderId != 0) {
            GLES20.glDeleteShader(glFragmentShaderId);
            glFragmentShaderId = 0;
        }
        if (glShaderProgramId != 0) {
            GLES20.glDeleteProgram(glShaderProgramId);
            glShaderProgramId = 0;
        }

        glVertexShaderId = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
        GLES20.glShaderSource(glVertexShaderId, vertexShaderCode);
        GLES20.glCompileShader(glVertexShaderId);

        glFragmentShaderId = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
        GLES20.glShaderSource(glFragmentShaderId, fragmentShaderCode);
        GLES20.glCompileShader(glFragmentShaderId);

        glShaderProgramId = GLES20.glCreateProgram();
        GLES20.glAttachShader(glShaderProgramId, glVertexShaderId);
        GLES20.glAttachShader(glShaderProgramId, glFragmentShaderId);
        GLES20.glLinkProgram(glShaderProgramId);

        glPositionAttribId = GLES20.glGetAttribLocation(glShaderProgramId, "positionAttrib");
        glNormalAttribId = GLES20.glGetAttribLocation(glShaderProgramId, "normalAttrib");
        glUvAttribId = GLES20.glGetAttribLocation(glShaderProgramId, "uvAttrib");
        glActorToViewMatrixId = GLES20.glGetUniformLocation(glShaderProgramId, "actorToViewMatrix");
        glViewToProjectionMatrixId = GLES20.glGetUniformLocation(glShaderProgramId, "viewToProjectionMatrix");

        glSynced = true;
    }
    GLES20.glUseProgram(glShaderProgramId);
}
When everything is up to date, this function merely tells OpenGL to use the existing shader program. If not, it first ensures that any previously compiled code is deleted, compiles both sets of shader code, creates a new shader program, attaches the compiled shaders to it, and links the program. Finally, it obtains the handles to the attributes and uniforms in the resulting shader program.
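Note that the bind function above does not verify that linking actually succeeded. During development, it can be worth adding a check along the following lines after glLinkProgram; this is a sketch rather than part of the class as listed, with Log being android.util.Log:

int[] linkStatus = new int[1];
GLES20.glGetProgramiv(glShaderProgramId, GLES20.GL_LINK_STATUS, linkStatus, 0);
if (linkStatus[0] == 0) {
    // The info log describes why linking (or an earlier compile step) failed.
    Log.e("ShaderProgram", GLES20.glGetProgramInfoLog(glShaderProgramId));
}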
The last function in this class is rather uneventful, as all it does is unbind the shader program from the active context:
public void unbind() { GLES20.glUseProgram(0); }
Mesh
The Mesh class is responsible for managing the mesh data, ensuring that the buffers with the mesh data are always present and loaded on the graphics hardware. The buffers we're managing with this class are the so-called Vertex Buffer Object (VBO) and Index Buffer Object (IBO). The former contains our interleaved data, matching the attributes we specified in the shader, and the latter contains the indices for the faces:
public class Mesh {
    private ShortBuffer indexBuffer;
    private int numIndices;

    private FloatBuffer vertexBuffer;
    private int numVertices;

    private int glIndexBufferId;
    private int glVertexBufferId;

    private boolean glSynced;
The Android GLES20 API wants us to use its buffer types instead of plain arrays, so we have to create them. We also want to keep track of the IDs of the buffers that we'll be creating. Finally, we have another glSynced variable, much as in the ShaderProgram class, for verifying the buffers on the graphics hardware.
Next, we move on to the constructor and the first functions:
public Mesh() {
    indexBuffer = null;
    vertexBuffer = null;
    numIndices = 0;
    numVertices = 0;
    glIndexBufferId = 0;
    glVertexBufferId = 0;
    glSynced = false;
}

public void setIndices(short[] indexArray) {
    numIndices = indexArray.length;
    ByteBuffer byteBuffer = ByteBuffer.allocateDirect(2 * numIndices);
    byteBuffer.order(ByteOrder.nativeOrder());
    indexBuffer = byteBuffer.asShortBuffer();
    indexBuffer.put(indexArray);
    indexBuffer.position(0);
    glSynced = false;
}

public int getNumIndices() {
    return numIndices;
}
The constructor is just the usual initialization to safe defaults. Our first function allows us to assign an index array to this class. We don't store the array itself, however; we store its size and then allocate a standard Java ByteBuffer of this size multiplied by the number of bytes in a short integer, as we're going from a short array to what is essentially a byte array. The reason we're creating a byte buffer rather than immediately wrapping the array we receive in a ShortBuffer is the allocateDirect function of the former. It allocates the memory for our data outside of the regular Java heap, where the native OpenGL code can access it directly, which is a useful optimization.
We make sure that we use the native byte order on our new byte buffer. Then, we transform it into a ShortBuffer and assign it to our indexBuffer variable. The latter is what we then fill in with the data from our index array, after which we reset the read position on the buffer to the beginning.
The getter function that follows it should be no surprise, simply returning the count of indices. This will be useful later on.
Next is the function for setting the vertex information. This is a bit more complex than the one for the index data:
public void setVertices(float[] positionArray, float[] normalArray, float[] uvArray) {
    numVertices = positionArray.length / 3;
    float[] interleavedArray = new float[8 * numVertices];
    for (int i = 0; i < numVertices; i++) {
        interleavedArray[8 * i + 0] = positionArray[3 * i + 0];
        interleavedArray[8 * i + 1] = positionArray[3 * i + 1];
        interleavedArray[8 * i + 2] = positionArray[3 * i + 2];
        interleavedArray[8 * i + 3] = normalArray[3 * i + 0];
        interleavedArray[8 * i + 4] = normalArray[3 * i + 1];
        interleavedArray[8 * i + 5] = normalArray[3 * i + 2];
        interleavedArray[8 * i + 6] = uvArray[2 * i + 0];
        interleavedArray[8 * i + 7] = uvArray[2 * i + 1];
    }

    ByteBuffer byteBuffer = ByteBuffer.allocateDirect(8 * 4 * numVertices);
    byteBuffer.order(ByteOrder.nativeOrder());
    vertexBuffer = byteBuffer.asFloatBuffer();
    vertexBuffer.put(interleavedArray);
    vertexBuffer.position(0);
    glSynced = false;
}
The procedure here is mostly the same, although we start off with three arrays instead of one. Therefore, the first thing we do is create an interleaved array using the same approach as in the utility function shown in ModelData. After creating this interleaved array, we go through the same steps of creating a FloatBuffer instance as we did for the index buffer.
Note that the byte buffer size here is the number of vertices multiplied by the byte size of one interleaved element (the stride): three floats for the position, three for the normal, and two for the UV coordinates, at 4 bytes per float, giving 8 * 4 = 32 bytes per vertex.
Next, we have our function for forcing a refresh of the Vertex Buffer Object (VBO) and Index Buffer Object (IBO):
public void setDirtyOnHardware() { glSynced = false; }
Just as in the ShaderProgram class, we have a bind function and an unbind function, which are called before and after drawing the mesh, respectively. First, we'll look at the bind function:
public void bind(ShaderProgram shaderProgram) {
    if (!glSynced) {
        int[] buffer_ids = new int[2];
        if (glIndexBufferId == 0 && glVertexBufferId == 0) {
            GLES20.glGenBuffers(2, buffer_ids, 0);
            glIndexBufferId = buffer_ids[0];
            glVertexBufferId = buffer_ids[1];
        }

        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, glIndexBufferId);
        GLES20.glBufferData(GLES20.GL_ELEMENT_ARRAY_BUFFER, 2 * indexBuffer.capacity(), indexBuffer, GLES20.GL_STATIC_DRAW);

        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, glVertexBufferId);
        GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, 4 * vertexBuffer.capacity(), vertexBuffer, GLES20.GL_STATIC_DRAW);

        glSynced = true;
    }
In this first part, we check the value of the glSynced variable. If we're not synchronized, we need to create new hardware buffers if necessary, bind them, and upload our local data to the graphics hardware. The first task is done by the glGenBuffers function, which allocates the new hardware buffers and gives us their IDs.

We then bind each buffer by specifying its type: the element array buffer is for indices, and the regular array buffer is for the vertex (interleaved) data. Finally, we call glBufferData, which performs the actual transfer of data to the hardware buffer. We give it the size of our local buffer in bytes, as well as a hint about how the buffer is likely to be used. Since we don't expect our data to change much, we indicate a static draw; for applications where we expect the data to change often, we should use the dynamic or stream option instead. The second half of the bind function then binds the buffers for drawing and sets up the vertex attributes:
    GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, glIndexBufferId);
    GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, glVertexBufferId);

    GLES20.glEnableVertexAttribArray(shaderProgram.getGlPositionAttribId());
    GLES20.glVertexAttribPointer(shaderProgram.getGlPositionAttribId(), 3, GLES20.GL_FLOAT, false, 8 * 4, 0);

    GLES20.glEnableVertexAttribArray(shaderProgram.getGlNormalAttribId());
    GLES20.glVertexAttribPointer(shaderProgram.getGlNormalAttribId(), 3, GLES20.GL_FLOAT, false, 8 * 4, 3 * 4);

    GLES20.glEnableVertexAttribArray(shaderProgram.getGlUvAttribId());
    GLES20.glVertexAttribPointer(shaderProgram.getGlUvAttribId(), 2, GLES20.GL_FLOAT, false, 8 * 4, 6 * 4);
}
With all the buffers in place on the graphics hardware, we simply tell OpenGL to bind (use) our buffers. We then enable each shader attribute and, with glVertexAttribPointer, describe where it lives in the interleaved buffer: the number of components, their type, the 32-byte stride, and the byte offset within each interleaved element.
Our final function is the unbind function:
public void unbind(ShaderProgram shaderProgram) {
    GLES20.glDisableVertexAttribArray(shaderProgram.getGlPositionAttribId());
    GLES20.glDisableVertexAttribArray(shaderProgram.getGlNormalAttribId());
    GLES20.glDisableVertexAttribArray(shaderProgram.getGlUvAttribId());

    GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0);
    GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
}
Here, we disable our vertex attributes again and unbind our buffers. This resets everything, readying things for the next drawing operation by this or another Actor instance.
Actor
This class can be considered the central nexus in the scene model. While the ShaderProgram and Mesh classes handle the implementation details of the drawing, it's the Actor class that ties it all together:
public class Actor {
    private ShaderProgram shaderProgram;
    private Mesh mesh;
    private float[] actorToWorldMatrix;
The general structure of this class can be seen from its class variables. We have the ShaderProgram and Mesh instances, followed by an actor-to-world matrix, which is what transforms the Actor's local coordinates to the world (scene) coordinates.
Again, the constructor for this class is quite basic:
public Actor() {
    shaderProgram = null;
    mesh = null;
    actorToWorldMatrix = new float[16];
    Matrix.setIdentityM(actorToWorldMatrix, 0);
}
We set sensible defaults and initialize our new matrix to the identity matrix. Then come the usual setters:
public void setShaderProgram(ShaderProgram shaderProgram) {
    this.shaderProgram = shaderProgram;
}

public void setMesh(Mesh mesh) {
    this.mesh = mesh;
}
This is followed by more specific setters for the matrix:
public void setPosition(float[] position) {
    actorToWorldMatrix[12] = position[0];
    actorToWorldMatrix[13] = position[1];
    actorToWorldMatrix[14] = position[2];
}

public void setOrientation(float[] at, float[] up, float[] side) {
    actorToWorldMatrix[0] = side[0];
    actorToWorldMatrix[1] = side[1];
    actorToWorldMatrix[2] = side[2];
    actorToWorldMatrix[4] = up[0];
    actorToWorldMatrix[5] = up[1];
    actorToWorldMatrix[6] = up[2];
    actorToWorldMatrix[8] = at[0];
    actorToWorldMatrix[9] = at[1];
    actorToWorldMatrix[10] = at[2];
}
Here, we can set the position and orientation of our model relative to the world space. Next is the onResume function:
public void onResume() {
    mesh.setDirtyOnHardware();
    shaderProgram.setDirtyOnHardware();
}
To enable reloading of OpenGL resources upon resuming the application, we need to implement our onResume function and use it to make our Mesh and ShaderProgram instances recreate any resources on the graphics hardware.
All that's left now is the draw function, which puts everything together:
public void draw(Camera camera, float[] viewToProjectionMatrix) {
    if (shaderProgram == null || mesh == null) {
        return;
    }

    shaderProgram.bind();
    mesh.bind(shaderProgram);

    float[] actorToViewMatrix = new float[16];
    Matrix.multiplyMM(actorToViewMatrix, 0, camera.getWorldToViewMatrix(), 0, actorToWorldMatrix, 0);

    GLES20.glUniformMatrix4fv(shaderProgram.getGlActorToViewMatrixId(), 1, false, actorToViewMatrix, 0);
    GLES20.glUniformMatrix4fv(shaderProgram.getGlViewToProjectionMatrixId(), 1, false, viewToProjectionMatrix, 0);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES, mesh.getNumIndices(), GLES20.GL_UNSIGNED_SHORT, 0);

    mesh.unbind(shaderProgram);
    shaderProgram.unbind();
}
After a quick sanity check to see that we have valid Mesh and ShaderProgram instances, we call bind on both of them, providing the Mesh instance with a reference to the shader program so that it can obtain the attribute handles from it.
Next, we use the camera's world-to-view matrix to transform our local actor-to-world matrix into an actor-to-view matrix. This transforms the coordinates of our mesh in such a way that they conform to the coordinate system used by the camera, or view.
This matrix is then uploaded to our vertex shader, together with the view-to-projection matrix we received as a parameter, where the shader code uses them in the transformation process.
We can now draw the model using the glDrawElements call. Here, we specify the primitive type we wish to draw (triangles), the number of indices to draw, and their type (unsigned short). The final parameter is the offset into the bound index buffer, which is zero since we want to start at the beginning.
The model should now be drawn and visible on the screen. Finally, we must do some cleanup before we pass control back. For this, we call the unbind function on both the Mesh and ShaderProgram instances. We are now done with this Android OpenGL ES 2.0 example.