HDR Rendering on iOS (iPad2/iPhone 4S)

Hey iDevBlogADay,

I’m excited about today’s post, because this has been sitting on my desk for a while. With the public release of iOS 5.0, I can finally release this stuff. I’m talking about a little HDR tech demo that I wrote during the iOS 5.0 beta to figure out how, and how well, HDR rendering works with the new A5-only OpenGL ES 2.0 extensions, specifically GL_OES_texture_float, GL_OES_texture_half_float, GL_OES_texture_half_float_linear, and GL_EXT_color_buffer_half_float. I’ve made the tech demo public on github. I won’t go into all the details (you can find those on github), but rather focus on the issues I ran into while creating the demo and working with those extensions. I’m not going to talk about what HDR is or what benefits it brings, but concentrate on the particular details of implementing an HDR renderer on iOS.

The demo

The demo itself is very simple. It features two cubes spinning in the center, with an HDR light probe from Paul Debevec’s page in the background. The tonemapping is a very basic linear tonemap. I’ve added a couple of controls to play with the individual options and the parameters of the tonemap (scale and bias). There is also a histogram of grayscale values. The demo currently only runs on the iPad 2, but an iPhone 4S port should be straightforward.
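Just to make the tonemap concrete, here is a minimal sketch of what such a linear tonemap fragment shader can look like, written as the C string you would hand to glShaderSource. The uniform and varying names are placeholders of mine, not necessarily what the demo uses; scale and bias correspond to the two tonemap parameters exposed in the UI.

// Hypothetical linear tonemap fragment shader (GLSL ES 1.00), as a C string.
static const char *kLinearTonemapFS =
    "precision mediump float;\n"
    "uniform sampler2D u_hdrTexture; // half-float color buffer from the HDR pass\n"
    "uniform float u_scale;\n"
    "uniform float u_bias;\n"
    "varying vec2 v_texCoord;\n"
    "void main() {\n"
    "    vec3 hdr = texture2D(u_hdrTexture, v_texCoord).rgb;\n"
    "    gl_FragColor = vec4(hdr * u_scale + u_bias, 1.0);\n"
    "}\n";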

HDRDemo on YouTube

Performance

The performance on the iPad 2 is pretty good. The main performance killer is the glReadPixels-based histogram computation — turning off the histogram significantly improves the performance.

Creating the Floating-Point FBO

Setting up the floating-point FBO was actually a lot more difficult than I anticipated. In theory, creating an HDR FBO is as simple as setting the color target to a different format. However, there are a couple of pitfalls, mostly related to glTexImage2D. The signature is

void glTexImage2D(GLenum target, GLint level, GLint internalFormat, GLsizei width, GLsizei height, GLint border, GLenum format, GLenum type, const GLvoid *data);

The important parts here are the internalFormat, the format, and the type. The internalFormat has a lot of different options, like GL_RGB8, GL_RGB10, etc., so it seems to have a lot of effect, but ultimately all it does is describe the number of color components. Initially I tried passing values like GL_RGBA16F_EXT, assuming that was the right one since it’s defined in the extension and seems to have no other use. But it turns out the only accepted values are GL_RGB or GL_RGBA (plain “3” or “4” would probably work as well). The really important parts are the format and type parameters, which define how the texture data is stored. Using GL_RGB/GL_RGBA as the format and GL_HALF_FLOAT_OES as the type did the job. Note that the higher-precision GL_FLOAT is not supported here. Additionally, make sure all your texture parameters are set up correctly, and are set before glTexImage2D is called. That should do the trick. The details of the FBO creation are in FramebufferObject::Create in framebufferobject.mm.
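To make those pitfalls concrete, here is a minimal sketch of the FBO setup. The names are placeholders of mine, and it assumes GL_OES_texture_half_float and GL_EXT_color_buffer_half_float are available; the demo’s actual code is in FramebufferObject::Create in framebufferobject.mm.

#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h> // GL_HALF_FLOAT_OES

// Creates a half-float color texture and attaches it to a new FBO.
static GLuint CreateHDRFramebuffer(GLsizei width, GLsizei height)
{
    GLuint colorTexture = 0, fbo = 0;

    glGenTextures(1, &colorTexture);
    glBindTexture(GL_TEXTURE_2D, colorTexture);

    // Texture parameters must be set before glTexImage2D.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // internalFormat stays plain GL_RGBA; the half-float storage is
    // requested through the type parameter.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_HALF_FLOAT_OES, NULL);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTexture, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        // An unsupported format combination usually shows up here.
        return 0;
    }
    return fbo;
}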

Creating the Histogram

Again, this was more difficult than expected. For some reason glReadPixels does not work with GL_HALF_FLOAT_OES and just returns a cryptic OpenGL error. This means we have to use GL_FLOAT, which has a big impact on performance, as the driver has to convert the data from 16-bit half to 32-bit float during the copy from GPU memory to main memory. To minimize that impact, as soon as the data has been downloaded from the GPU I kick off a GCD job on a separate thread to compute the histogram. The details can be seen in drawInRect in ViewController.mm.
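Below is a rough sketch of the readback and the histogram loop; the names, the bucket count, and the luma weights are my own choices rather than the demo’s exact ones (see drawInRect for the real thing).

#include <OpenGLES/ES2/gl.h>
#include <stdlib.h>
#include <string.h>

enum { kNumBuckets = 64 };

static void ComputeHistogram(const float *pixels, GLsizei width, GLsizei height,
                             unsigned int histogram[kNumBuckets])
{
    memset(histogram, 0, sizeof(unsigned int) * kNumBuckets);
    for (GLsizei i = 0; i < width * height; ++i) {
        const float *p = pixels + 4 * i;
        // Rec. 709 luma, clamped into [0, 1) before bucketing.
        float luma = 0.2126f * p[0] + 0.7152f * p[1] + 0.0722f * p[2];
        if (luma < 0.0f)   luma = 0.0f;
        if (luma > 0.999f) luma = 0.999f;
        histogram[(int)(luma * kNumBuckets)]++;
    }
}

static void ReadbackAndComputeHistogram(GLsizei width, GLsizei height,
                                        unsigned int histogram[kNumBuckets])
{
    // GL_HALF_FLOAT_OES is rejected here, so read back 32-bit floats and
    // accept the conversion cost in the driver.
    float *pixels = (float *)malloc((size_t)width * height * 4 * sizeof(float));
    glReadPixels(0, 0, width, height, GL_RGBA, GL_FLOAT, pixels);
    // In the demo, the histogram loop runs as a GCD job on a separate thread,
    // so the GL thread is only blocked for the readback itself.
    ComputeHistogram(pixels, width, height, histogram);
    free(pixels);
}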

Creating a Lightprobe Environment Cube Map

Since Paul Debevec’s light probes are a classic way to demonstrate the fidelity in amplitude that is gained with HDR rendering, I really wanted to add them to the demo. To do that, I first had to slice the original light-probe cross into separate PFMs, which then need to be rotated and aligned. I used trial and error to find the right mapping between the sides. All of this is in the little Python script in tools/lightprobe_to_hdrtex.py; it should work with any light probe from Paul’s page. Once that’s done, the same remarks about glTexImage2D from above apply here as well.
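For reference, here is a small sketch of how the six sliced faces can then be uploaded as a half-float cube map. faceSize, faceData, and the face order are assumptions on my part; the internalFormat/format/type rules from the FBO section apply unchanged.

#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h> // GL_HALF_FLOAT_OES

// faceData[0..5] point to half-float RGB pixels for +X, -X, +Y, -Y, +Z, -Z.
static GLuint CreateLightProbeCubeMap(GLsizei faceSize, const void *faceData[6])
{
    GLuint cubeMap = 0;
    glGenTextures(1, &cubeMap);
    glBindTexture(GL_TEXTURE_CUBE_MAP, cubeMap);

    // Linear filtering on half-float data needs GL_OES_texture_half_float_linear.
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    for (int face = 0; face < 6; ++face) {
        // Plain GL_RGB as internalFormat; half-float via the type parameter.
        glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + face, 0, GL_RGB,
                     faceSize, faceSize, 0, GL_RGB, GL_HALF_FLOAT_OES,
                     faceData[face]);
    }
    return cubeMap;
}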

And that’s pretty much it! Developing this demo took quite a while because of all the small issues I kept running into, but now that this stuff is figured out, I hope it will be easy for everyone to add HDR rendering to their apps.

I do not currently have concrete plans for the future of the demo. I may add iPhone 4S support and maybe some other HDR-related features (such as bloom filtering or different tonemapping operators), and I very much welcome any commits and patches for this repository!

Cheers,

Volker

5 comments on “HDR Rendering on iOS (iPad2/iPhone 4S)”

  1. I do agree with all the concepts you’ve introduced on your post. They are really convincing and will certainly work. Still, the posts are very quick for newbies. May just you please extend them a bit from next time? Thanks for the post.

  2. You are writing:
    “Using GL_RGB/GL_RGBA as the type and GL_HALF_FLOAT_OES as the format did the job”, however the signature is format first, type second, so it should be “Using GL_RGB/GL_RGBA as the format and GL_HALF_FLOAT_OES as the type did the job”. This is also flipped in your code.
