Another shader in the GPUImage framework that doesn’t take any inputs other than the current pixel color is the luminance filter. This blog post will not only go over this very simple shader, but will also go into the science behind how the algorithm was developed.

How Humans Perceive Color

Luminance is the overall brightness of an image, independent of its color. When you look at a black and white image, you are seeing only the light present in the image, with the color removed. The luminance filter should accurately represent the image without its color.

You might think that you would need an equal representation of red, green, and blue in the image, but humans perceive the brightness of different colors differently. Specifically, humans are most sensitive to colors within the green spectrum.

In fact, video data is commonly encoded using the YCbCr color space, emphasizing the Y component, which represents luminance. Many image sensors also have twice as many green sensors as red or blue ones. This will be covered in more detail in later blog posts.

Fun fact: In the 1960 movie Psycho, the substance used for blood during the stabbing scene in the shower was chocolate syrup. A blood-red substance does not show up well on black and white film, so they needed a darker substance to read accurately on screen.

The flip side of luminance is chrominance. Chrominance is the saturation and color of the specific pixel. Human beings are more sensitive to brightness than saturation. This makes sense evolutionarily. If you’re wandering around at night where there are a lot of predators, your vision needs to be sensitive to differences in brightness and movement rather than being sensitive to how green a plant is. Color can give you information about whether a plant is poisonous or not, but fine differences in shades of red don’t really give you an advantage the way that brightness does.

Luminance Shader

In GPUImage 3, we have a file called OperationShaderTypes.h. This file contains our shared vertex structures and common constant values used in multiple shaders. One of those is our algorithm for luminance weighting:

constant half3 luminanceWeighting = half3(0.2125, 0.7154, 0.0721); 

This algorithm came from Graphics Shaders: Theory and Practice. We use these values in the luminance fragment shader:

fragment half4 luminanceFragment(
	SingleInputVertexIO fragmentInput [[stage_in]],
	texture2d<half> inputTexture [[texture(0)]])
{
	constexpr sampler quadSampler;
	half4 color = inputTexture.sample(quadSampler, fragmentInput.textureCoordinate);
	half luminance = dot(color.rgb, luminanceWeighting);
	return half4(half3(luminance), color.a);
}

The meat of this function is this line:

half luminance = dot(color.rgb, luminanceWeighting);

One of the better explanations I have found of the dot product is on this site. This line of code takes the red, green, and blue values of the input pixel, multiplies each by the weight we set for luminance, and then sums the results. The red value is multiplied by 0.2125. The green is weighted by a whopping 0.7154, while the blue only gets 0.0721. These three weights add up to 1.

If you were to calculate the luminance of white, it would look something like this:

(1.0 * 0.2125) + (1.0 * 0.7154) + (1.0 * 0.0721) = 1.0

The goal is that you never wind up with a value above 1.0. If the red, green, and blue values were weighted equally, then a desaturated red would look exactly the same as a desaturated green and a desaturated blue. Even though all three would be the same shade of gray, the result would look off to our eyes because we perceive these colors differently, as seen below:

This luminance value is then used for all three color channels to ensure the image is a shade of gray:

return half4(half3(luminance), color.a);


This is still a relatively simple shader. Many of the shaders in the GPUImage framework build off of luminance, so understanding this concept will help you later with more complex shaders.

Next, we’re going to start covering shaders that require properties to be shared between the CPU and the GPU, and you will learn how to encode those values into buffers.

