Wednesday, May 11, 2011

Android OpenGL ES texture management.

My Android game is still 'coming along'. I'll be making an announcement post soon. It's consumed most of my late-evening "dad time"; I haven't touched a soldering iron in a while, but I'll make up for lost time later.

Progress on the game has generally gone like this: bang out some framework code, bang out cool feature, cool feature, find a minor bug, spend the next few evenings pulling my hair out, framework code, cool feature, cool feature, rinse, repeat. This post is about one of those hair-pulling moments and my solution. It revolves around Android's handling of the OpenGL texture management APIs.

Recently, I was reworking the texture management code in my "engine" (if you can call it that; it's really just a few folders of Java files with a focus on lower-level graphics and my own collision detection). I wanted to add the ability to have "dynamic" textures, ones that could be created and modified in the middle of a game in progress. Android has been fun to work with, but there is a dearth of documentation when it comes to the 3D stuff; you are typically referred to external OpenGL ES docs, or to the SDK samples, which invariably don't do what *you* are trying to do.

Disclaimer: Though I have been in the video game business for a long time, I have rarely done low level graphics rendering work. I would not consider myself a guru in that area, so take this post as food for thought and not as an authoritative doctrine. If you think I'm wrong on something, call me out!

A "texture" in OpenGL is nothing more than a buffer, typically containing an image, that resides in OpenGL-managed memory, in a format appropriate for the hardware. On a PC, these textures often reside in "video memory" on the graphics card. One does not manipulate this buffer directly, but controls its contents through the OpenGL API. Most games have a lot of "static" textures, created by artists, which are applied to mesh geometry in-game or used as inputs to shader code running on the graphics chip.

The general process for using predefined textures in a game works like this (I'm ignoring things like texture units and mips for simplicity):

  1. Load image file and process into a chunk of bytes.
    1. On Android, you can use the BitmapFactory class. There are utility functions that will accept a Bitmap object.
  2. Create a GL texture buffer "handle"
    1. This is done with glGenTextures
  3. "Bind" that handle as the "current" texture; this means all subsequent texture calls will act on this texture.
    1. This is done with glBindTexture
  4. Process and/or upload the chunk of bytes to the GL texture buffer.
    1. GLUtils.texImage2D in the Android SDK does this, with a Bitmap object as input.
Once uploaded, the original source buffer is no longer needed, and its memory may be freed. 
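The steps above can be sketched roughly like this (a minimal OpenGL ES 1.x example; the `uploadTexture` helper and its parameters are my invention for illustration, and I'm skipping error checking and texture parameters you'd want in real code):

```java
import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES10;
import android.opengl.GLUtils;

public final class TextureUpload {
    // Hypothetical helper: decode an image resource and push it into a GL texture.
    public static int uploadTexture(Resources res, int resId) {
        // 1. Load the image file and process it into a chunk of bytes (a Bitmap).
        Bitmap bitmap = BitmapFactory.decodeResource(res, resId);

        // 2. Create a GL texture buffer "handle".
        int[] handles = new int[1];
        GLES10.glGenTextures(1, handles, 0);
        int handle = handles[0];

        // 3. Bind that handle as the "current" texture.
        GLES10.glBindTexture(GLES10.GL_TEXTURE_2D, handle);

        // 4. Upload the bytes into the GL-managed buffer.
        GLUtils.texImage2D(GLES10.GL_TEXTURE_2D, 0, bitmap, 0);

        // The source Bitmap is no longer needed once uploaded.
        bitmap.recycle();
        return handle;
    }
}
```

Remember that if the GL context is lost (see the caveat below), the returned handle becomes meaningless and the whole sequence has to be run again.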

There is a caveat: various things outside of your control may cause all the GL buffers to be destroyed and reclaimed by the system. This might happen in Android if you switch to another app or another Activity, or if the device goes to sleep. When that happens, you must re-create the GL buffers using the steps outlined above. 

I have a "ManagedTexture" class in my engine that holds a Bitmap, and when the rendering context is created, I have a Texture Manager that "uploads" all the textures to OpenGL. It also supports "lazy uploading", in that it won't actually create the OpenGL texture until something actually uses that ManagedTexture. (That means only step 1 above is done at game start; steps 2-4 happen "on the fly".)

Now for dynamic textures...

I need to have Bitmap objects that I can draw onto with a Canvas object. The Bitmap could be anything; sometimes it is text, but it doesn't have to be. When something happens that changes the bitmap, a "dirty" flag is set, and the next time a frame is rendered, GLUtils.texImage2D is called again to upload the changed Bitmap into the GL buffer. One common use is to then draw this texture on a camera-aligned quad and voilà, a "dynamic sprite". 
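The dirty-flag idea looks something like this (a sketch, not my actual ManagedTexture code; the class and method names here are made up, and I'm assuming the GL texture handle already exists from the steps earlier):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.opengl.GLES10;
import android.opengl.GLUtils;

// Hypothetical sketch of a "dynamic texture": a CPU-side Bitmap you draw on
// with a Canvas, re-uploaded to GL only when its contents have changed.
public final class DynamicTexture {
    private final Bitmap bitmap;
    private final Canvas canvas;
    private final int glHandle;      // created earlier via glGenTextures
    private boolean dirty = true;    // force an initial upload

    public DynamicTexture(Bitmap bitmap, int glHandle) {
        this.bitmap = bitmap;
        this.canvas = new Canvas(bitmap);
        this.glHandle = glHandle;
    }

    // Game code draws onto the Bitmap and marks it dirty.
    public Canvas beginDraw() {
        dirty = true;
        return canvas;
    }

    // Render thread calls this before using the texture.
    public void bind() {
        GLES10.glBindTexture(GLES10.GL_TEXTURE_2D, glHandle);
        if (dirty) {
            // Re-upload the changed pixels into the bound GL buffer.
            GLUtils.texImage2D(GLES10.GL_TEXTURE_2D, 0, bitmap, 0);
            dirty = false;
        }
    }
}
```

Note that `texImage2D` reallocates the whole texture; if only a sub-region changes, `GLUtils.texSubImage2D` can update it in place.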

My most recent frustration...

I have several Android devices for testing. In my experience, "fragmentation" is hyped to be worse than it really is, but it does exist. From a developer's point of view, 3D graphics is one of the areas where you tend to run into inconsistencies between devices.

I had been running/debugging for a while on the emulator and on my personal phone, an LG Optimus series device. I periodically switch which device I'm running on just to make sure nothing broke.

I recently added a new "notification" system to the game, where I create a sprite on the fly and can 'script' and animate a bunch of its properties. When I ran it on one of my alternate devices, I noticed a brief white flicker whenever a new notification popped up on screen. I had not seen this problem on my phone. Sonofa!!!! Somewhere Steve Jobs is cackling...

To make a long story slightly less long, I tracked the issue down to this: when I try to render anything using a texture object I just created in that render pass, it renders as a white texture for that frame. On the next render pass, it renders fine. There is a small chance that there is something else going on, but I'm 99% certain (which probably means I'm wrong :) ). I tried all sorts of ideas, like calling glFinish and glFlush; then I tried hacks, like uploading the textures twice. Nothing worked. 

Since I did not *notice* the problem on all devices, I did some comparative debugging. With some properly placed breakpoints, I could catch a white square where the texture should be rendering. There was nothing I could find out of the ordinary on the first render pass. Then I ran the same code on my phone, where I had not seen the problem. At the breakpoint, I noticed what appeared to be a faint *black* texture rendering. Since I had alpha blending turned on, and also because my game uses a mostly black background, it was hard to notice. But: the same problem existed on all devices and just manifested in different ways. Come to think of it, I think the problem didn't show up on the emulator, but I don't count that. The Android emulator is notorious for allowing you to do things that won't work on most real hardware, such as using texture sizes that are not powers of 2. 

Side bar:
FYI, using Android-x86 in conjunction with VirtualBox can sometimes be nicer than using the stock emulator; it's often faster, but it's pretty much an outlier in hardware compatibility and useless if you're developing any Android native code. You also have to set up ADB to work over TCP, which is slightly more effort.


My 'solution' to the problem is that any rendering code that selects a texture must check a return code from my render state manager, and bail out of that render pass if the return code indicates that the texture was newly created this frame. This is annoying. However, in game development, it's really not uncommon to have something that lags a frame or two because it may not be fully initialized until a full render or 'update' cycle has completed. Before I worked in games, I worked in simulation, where you have to be a lot more careful about that kind of stuff.
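In plain Java, the guard amounts to a per-frame quarantine set (a sketch; the class and method names are invented here, and the real version would live inside the render state manager):

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of the frame-delay guard: texture handles created
// during the current frame are quarantined, and any draw call that selects
// one is told to bail out until a full frame has passed.
public final class RenderStateManager {
    public static final int TEX_OK = 0;
    public static final int TEX_TOO_NEW = 1; // created this frame; skip drawing

    private final Set<Integer> createdThisFrame = new HashSet<Integer>();

    // Called wherever glGenTextures/texImage2D happens mid-frame.
    public void noteTextureCreated(int handle) {
        createdThisFrame.add(handle);
    }

    // Rendering code checks this return code and bails on TEX_TOO_NEW.
    public int selectTexture(int handle) {
        return createdThisFrame.contains(handle) ? TEX_TOO_NEW : TEX_OK;
    }

    // Called once at the end of each render pass (e.g. end of onDrawFrame).
    public void endFrame() {
        createdThisFrame.clear();
    }
}
```

A texture created during frame N is skipped for the rest of frame N, and after `endFrame()` it renders normally from frame N+1 on.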

I'm not really happy with the solution, it feels more like a workaround than a fix. Either I ran into a widespread bug in Android's OpenGL implementation, or I missed that day in school where they covered the dos and don'ts of generating texture buffers in a render pass.

-- P


  1. I have re-introduced this bug several times since posting this. At times, I considered pulling this post because I suspected I was just way off. There are two threads involved and a lot of shared bytes between them. Right after I posted this, I did some optimizations, ones that caused me to inadvertently use Textures without letting a GL frame pass after creating them.

    As of now, I still stand by the premise originally stated: when you create a texture with glGenTextures and use glTexImage2d, you must then wait until the next frame before actually using it.

    The frame-delay code is annoying to have to use, but it is not too onerous (bugs notwithstanding), since I handle it at a pretty low level in my "engine".


  2. Just a guess, but they probably didn't expect you to be creating textures in the middle of a frame (during onDrawFrame); instead you should be calling queueEvent(Runnable) to inject the code between frames.

  3. That's a great catch Jamie, that sounds like it could achieve the same thing as my work-around, but in a much cleaner way. I just released my app, in beta, to the Android Market this week. I'm overdue for an optimization and cleanup pass, and I'd like to focus some on my graphics code. I'm also not currently using VBOs, and from what I read, those can really boost performance.

  4. You also made me realize that comments are unmoderated on old posts; I thought I had it set to moderate comments on posts older than two weeks. Grateful for the feedback though!

    I have in my list of things to do, to make a basic Android project that reproduces the issue I saw here, so I could follow up on this. Admittedly, that's low on my priorities, so it might take a couple of months at least.

  5. I know this is a bit of an old post, but I do hope you are watching the comments still. I am developing directly on a tablet now and seem to have run into a perplexing problem: all texture names are being marked as incomplete on each render pass, similar to what you have mentioned but not identical. Do you have any information about this kind of behavior?

    1. I do not keep up on comments as much as I should, they have a way of sneaking up without me realizing. It's been sooo long since I worked on Sedition or Android rendering, that I don't have any insight above what's in the post above.


I welcome your thoughts. Keep it classy; think of the children.