
iPhone - Fragment Shader - Average Luminosity

Does anybody know how to find the average luminosity of a texture in a fragment shader? I have access to both RGB and YUV textures. The Y component in YUV is an array, and I want to compute an average value from it.


1 Answer


I recently had to do this myself for input images and video frames that I had as OpenGL ES textures. I didn't generate mipmaps for these because I was working with non-power-of-two textures, and OpenGL ES 2.0 on iOS can't generate mipmaps for NPOT textures.

Instead, I did a multistage reduction similar to mipmap generation, but with some slight tweaks. Each step down reduced the size of the image by a factor of four in both width and height, rather than the factor of two used for mipmaps. I did this by sampling from four texture locations, each placed at the center of one of the 2x2-pixel squares making up a 4x4 area in the higher-level image. Hardware texture interpolation averages each 2x2 block for free, so I then only had to average those four samples to achieve a 16x reduction in pixel count in a single step.
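As a rough illustration, here is a minimal sketch of what one reduction-stage fragment shader could look like. The uniform and varying names (inputTexture, upperLeftCoordinate, and so on) are hypothetical, not the exact GPUImage names, and it assumes the vertex shader has already computed the four sampling coordinates, each centered on a 2x2 block of the source 4x4 region:

```glsl
// Hypothetical reduction-stage fragment shader (GLSL ES 1.0).
// Each texture2D() call lands at the center of a 2x2 pixel block,
// so hardware bilinear filtering returns that block's average.
precision highp float;

uniform sampler2D inputTexture;     // assumed name

varying vec2 upperLeftCoordinate;   // assumed: computed in the vertex
varying vec2 upperRightCoordinate;  // shader, one per 2x2 block of
varying vec2 lowerLeftCoordinate;   // the 4x4 source region
varying vec2 lowerRightCoordinate;

void main()
{
    float upperLeft  = texture2D(inputTexture, upperLeftCoordinate).r;
    float upperRight = texture2D(inputTexture, upperRightCoordinate).r;
    float lowerLeft  = texture2D(inputTexture, lowerLeftCoordinate).r;
    float lowerRight = texture2D(inputTexture, lowerRightCoordinate).r;

    // Averaging the four samples averages all 16 source pixels.
    float average = 0.25 * (upperLeft + upperRight + lowerLeft + lowerRight);
    gl_FragColor = vec4(vec3(average), 1.0);
}
```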

I converted the image to luminance at the very first stage using a dot product of the RGB values with a vec3 of (0.2125, 0.7154, 0.0721) (the Rec. 709 luma coefficients). This let me read just the red channel in each subsequent reduction stage, which really helps on iOS hardware. Note that you don't need this step if you're starting from a Y-channel luminance texture, but I was dealing with RGB images.
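Again as a hedged sketch, the conversion stage might look something like the following; the names are illustrative rather than the actual GPUImage shader source:

```glsl
// Hypothetical first-stage fragment shader: RGB to luminance.
precision highp float;

uniform sampler2D inputTexture;   // assumed name
varying vec2 textureCoordinate;   // assumed name

// Rec. 709 luma coefficients from the answer above.
const vec3 W = vec3(0.2125, 0.7154, 0.0721);

void main()
{
    vec3 rgb = texture2D(inputTexture, textureCoordinate).rgb;
    float luminance = dot(rgb, W);

    // Write luminance into every channel; later stages read only .r.
    gl_FragColor = vec4(vec3(luminance), 1.0);
}
```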

Once the image had been reduced to a sufficiently small size, I read those pixels back onto the CPU and did one last quick pass over the remaining few to arrive at the final luminosity value.
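The readback step could look something along these lines (a sketch in C against the OpenGL ES 2.0 API, not the framework's actual code; it assumes the last reduction pass rendered into a currently bound framebuffer of finalWidth x finalHeight, both hypothetical parameters):

```c
/* Hypothetical CPU-side finishing step (C, OpenGL ES 2.0 on iOS). */
#include <OpenGLES/ES2/gl.h>
#include <stdlib.h>

float averageLuminosity(GLsizei finalWidth, GLsizei finalHeight)
{
    size_t pixelCount = (size_t)finalWidth * (size_t)finalHeight;
    GLubyte *pixels = (GLubyte *)malloc(pixelCount * 4);

    /* GL_RGBA / GL_UNSIGNED_BYTE is the one format pair that
     * OpenGL ES 2.0 guarantees to be readable. */
    glReadPixels(0, 0, finalWidth, finalHeight,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* The shaders wrote luminance into every channel, so the red
     * channel alone carries each block's average. */
    unsigned long sum = 0;
    for (size_t i = 0; i < pixelCount; i++) {
        sum += pixels[i * 4];
    }
    free(pixels);

    /* Normalize from 0..255 back to 0.0..1.0. */
    return (float)sum / (float)pixelCount / 255.0f;
}
```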

For a 640x480 video frame, this process yields a luminosity value in roughly 6 ms on an iPhone 4, and I think I can squeeze another 1-2 ms out of that processing time with a little tuning. In my experience, that seems faster than iOS devices can normally generate mipmaps for power-of-two images of around that size, but I don't have solid numbers to back that up.

If you wish to see this in action, check out the code for the GPUImageLuminosity class in my open source GPUImage framework (and the GPUImageAverageColor superclass). The FilterShowcase example demonstrates this luminosity extractor in action.

