
Gamma correction

In imaging, especially in computer imaging, gamma correction is something that gets largely ignored, yet at the same time can be a real nightmare.

In order to fully understand gamma correction, we need to understand how both human vision and display devices work.

Our perception of brightness is dependent on the amount of energy carried by light. In other words, the more energetic the light is, the brighter it looks.

However, the relationship is not linear. One could hastily think that if we double the amount of energy, it will look twice as bright. However, the human eye doesn't work that way. Doubling the amount of energy does not double the perceived brightness. The actual function is quite complex. Thus we have two different concepts:
  1. The absolute amount of energy carried by light. The technical term for this is radiant flux.
  2. The perceived brightness of the light. The technical term for this is luminous flux. (Yes, it's confusing.)
As said, the relationship between the two is not linear, and in fact is quite complex. It can be, however, approximated with a logarithmic curve. (Although, complicating things even further, the curve depends also on the wavelength of the light, as the eye perceives different wavelengths in different manners.)

Normally we don't need to worry about this difference. However, there are some applications, such as computer 3D rendering, where, if the application wants to be as physically accurate as possible, the distinction needs to be made. For instance, the amount of light reflected by a surface is a function of the cosine of the angle between the incoming light and the normal vector of the surface. However, it's a function of the radiant flux, not the luminous flux, which is the perceived brightness of light. (For example, at 60 degrees the amount of reflected light is half of the amount that's reflected at 0 degrees. Half of the radiant flux that is, not half of the perceived brightness.)
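To make the distinction concrete, here is a small sketch in Python. It uses a simple power curve with exponent 2.2 as a crude stand-in for the eye's response; as noted above, the real curve is more complicated, so treat the numbers as illustrative only.

    import math

    GAMMA = 2.2  # crude stand-in for the eye's response curve; not physically exact

    def reflected_radiant_flux(incoming_flux, angle_degrees):
        # Lambertian reflection: the reflected *energy* scales with the cosine
        # of the angle between the incoming light and the surface normal.
        return incoming_flux * math.cos(math.radians(angle_degrees))

    def perceived_brightness(radiant_flux):
        # Very rough approximation of perceived brightness as a power curve.
        return radiant_flux ** (1.0 / GAMMA)

    flux_at_0 = reflected_radiant_flux(1.0, 0)    # 1.0
    flux_at_60 = reflected_radiant_flux(1.0, 60)  # 0.5: half the energy...
    print(perceived_brightness(flux_at_60))       # ~0.73: ...but not half the brightness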

Another example is image dithering: If you want to reduce the colors of an image and use dithering, you need to be aware of gamma correction if you want to retain the perceived brightness of the result. (The reason for this is that, for example, a "checkerboard pattern" of white and black pixels does not look the same in brightness as an area of half-gray pixels. This is due to the difference between luminous and radiant flux.) What's worse, reducing a color image to fewer colors while preserving the perceived brightness of the colors is a lot more complicated.
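Here is a minimal sketch of that checkerboard example, assuming a display gamma of 2.2 (the exact value varies from display to display, so the numbers are only indicative):

    GAMMA = 2.2  # assumed display gamma; real displays vary

    def pixel_to_light(value):
        # Approximate linear light emitted by a gamma-2.2 display
        # for an 8-bit pixel value.
        return (value / 255.0) ** GAMMA

    def light_to_pixel(light):
        # Inverse: which stored pixel value emits the given amount of light.
        return round(255.0 * light ** (1.0 / GAMMA))

    # A 50/50 checkerboard of black and white pixels emits, on average,
    # half of the light emitted by a pure white area:
    checkerboard_light = (pixel_to_light(0) + pixel_to_light(255)) / 2  # 0.5

    # A solid area of "half-gray" (128) emits considerably less than that:
    print(pixel_to_light(128))                 # ~0.22
    # To actually match the checkerboard's brightness you need roughly:
    print(light_to_pixel(checkerboard_light))  # ~186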

But in computing, this is only half of the issue. The other half is the display device. You see, display devices don't emit light in a linear manner either. And to complicate things, different displays may have different emission curves. This means that without any kind of correction the exact same image may look noticeably different with different displays.

For the casual user this seldom makes a significant difference. However, for professional graphics artists, publishers and so on it can make a huge difference. If you, for example, design a beautiful image for the cover of a magazine on your computer, and then when it's printed you find out that the white balance and color balance are completely off in the printed version, you won't be very happy. And this is just one example of many.

Gamma correction is an attempt to fix this issue. Its intent is that if an image file has gamma correction data in it, the image will always look (nearly) identical in all (properly-configured) systems regardless of the differences in the physical properties of their hardware.

(The most common gamma for modern displays is 2.2. Quite curiously, this roughly matches the response curve of the human eye as well. This means that a pixel with relative RGB values of (0.5, 0.5, 0.5) will look approximately half as bright as a pixel of (1.0, 1.0, 1.0). In other words, the relationship between stored pixel values and perceived brightness is almost linear. This is quite convenient.)
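As a sketch of what a gamma-aware viewer roughly has to do: decode the stored samples back to linear light using the gamma they were encoded with, and then re-encode them for the gamma of the actual display. (The conventions for which exponent "gamma" refers to vary between specifications, so treat the exact formula below as an illustration, not as the one true definition.)

    def gamma_correct(stored, file_gamma, display_gamma):
        # Here 'file_gamma' is taken to be the decoding exponent of the sample,
        # i.e. linear_light = stored ** file_gamma. (Conventions differ!)
        linear_light = stored ** file_gamma           # decode to linear light
        return linear_light ** (1.0 / display_gamma)  # re-encode for this display

    # An image encoded for a gamma-1.8 system, shown on a gamma-2.2 display:
    print(gamma_correct(0.5, 1.8, 2.2))  # ~0.57: the viewer has to brighten the pixel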

The problem is that gamma handling has never been universally standardized, and it has become a complete and utter mess. There isn't even universal agreement on which part of the entire system should be handling the gamma correction. Should it be the image file itself? Should it be the image viewer program? The operating system? The display driver? The graphics card? The display itself? Obviously if two of these (or anything in between) end up doing gamma correction, you will end up with over-correction, and the image will look like crap. (The difference between gammas 1.0, i.e. no correction, and 2.2 is extremely visible. It's not like it's something subtle that only a trained eye can see.)
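The over-correction problem is easy to demonstrate with numbers (purely illustrative, assuming every offender in the chain applies the same 1/2.2 correction):

    # Two components in the chain each "helpfully" apply a 1/2.2 correction:
    value = 0.5
    once = value ** (1.0 / 2.2)  # ~0.73: corrected once
    twice = once ** (1.0 / 2.2)  # ~0.87: corrected again further down the chain
    print(once, twice)           # the doubly-corrected image looks visibly washed out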

The PNG image file format is infamous for messing up gamma correction really badly. It almost ended up destroying the popularity of the file format.

By far the most common use for PNG images is in web design. Usually web pages do not care about gamma correction of their images, and instead aim for a consistent visual design.

What was one of the major problems related to this? Well, imagine that you make a web page that has a background color of (128, 128, 128), and you have a PNG image with some pixels having the color (128, 128, 128). Obviously you would expect the two colors to look identical, and would often rely on this when designing a web page. Often (especially when PNG was still new) you would be wrong, and end up with the PNG's half-gray looking completely different from your web page's half-gray background. Why?

Because the creators of the PNG format wanted to support gamma correction, did it poorly, and the whole thing got extremely messy.

Some web browsers supported the gamma info in PNG files, while others didn't. This meant that if the PNG had gamma info in it, it would look different in different browsers.
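Why the gamma-aware browsers made the PNG's gray drift away from the page's gray can be sketched like this. (The gamma values here are hypothetical and purely illustrative; the actual behavior varied by browser and platform.)

    def displayed_png_sample(stored, file_gamma, display_gamma):
        # A gamma-aware browser decodes the sample using the gamma declared in
        # the PNG's gAMA chunk, then re-encodes it for the assumed display gamma.
        linear = (stored / 255.0) ** (1.0 / file_gamma)
        return round(255.0 * linear ** (1.0 / display_gamma))

    # Hypothetical scenario: the PNG declares a gamma of 1/2.2, and the browser
    # assumes a display gamma of 1.8:
    png_gray = displayed_png_sample(128, 1.0 / 2.2, 1.8)
    css_gray = 128  # the CSS background color goes to the screen untouched
    print(png_gray, css_gray)  # ~110 vs 128: the "same" half-gray no longer matches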

Fine, just leave the gamma info out of the PNG completely. The file format supports that. But no, and here comes the screwup. You see, the PNG standardization committee stipulated that if a PNG file has no gamma information, the software displaying the image should nevertheless use an assumed gamma value.

Of course some browsers implemented this standard, while others didn't. So even if you left the gamma info completely out of the PNG file, it would still look different in different browsers. (And, what's worse, the half-gray in the PNG would look different than the half-gray page background color in the standard-conforming browsers. And if you tried to fix that with the PNG file, it would look different in the non-standard-conforming browsers.)

Needless to say, none of this was a problem with either GIF or JPEG, neither of which had any gamma info messing things up to begin with.
