A number of engineers all had the same response to the sRGB/Gamma thread on the Mac OpenGL list: life would be a lot easier if color were linear. Yes, it would be easier. But would it be beautiful?
The answer is: not at 8 bits, and definitely not with DXT compression.
The following images show a gray-scale bar quantized to 16, 8, 6, and 5 bits per channel. (16 bits per channel would be typical of a floating-point, HDR, or art-asset pipeline; 8 bits is what most apps will have to run on the GPU; and 5/6 bits simulate the banding in the key colors of DXT-compressed textures, which are stored as 5-6-5.)
In the images labeled "srgb" (gamma is 1.0), the colors are quantized in sRGB (non-linear) space. Because sRGB is perceptually even, the banding appears even to a human eye - it's a good use of our limited bits. 8-bit color is pretty much smooth, and artifacts are minimized for 5 and 6 bits (although we can definitely see some banding there).
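To make "quantized in sRGB space" concrete, here is a rough sketch (the helper name and values are my own, not the post's actual program): the ramp values are already sRGB, so quantizing just means snapping each one to the nearest of the 2^n available codes.

```cpp
// Minimal sketch: quantize a value that is already in sRGB space to n bits.
// Because the spacing happens in the perceptually even sRGB domain, adjacent
// codes look roughly equally far apart to the eye.
#include <cmath>
#include <cstdio>

// Snap a [0,1] value to the nearest of 2^bits levels.
static double quantize(double v, int bits)
{
    double levels = (1 << bits) - 1;
    return std::round(v * levels) / levels;
}

int main()
{
    // Print a few evenly spaced sRGB values and their 5-bit quantization,
    // simulating the red/blue channels of a DXT 5-6-5 key color.
    for (double s = 0.0; s <= 1.0001; s += 0.125)
        std::printf("srgb %.3f -> 5-bit %.3f\n", s, quantize(s, 5));
    return 0;
}
```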
Now what happens if we quantize in linear space? You get this:
Note: the program generates these ramps in sRGB space (hence they are "evenly spaced"), converts to linear, quantizes, then converts back to sRGB. So this is what your textures would look like if your art assets were converted to and stored in linear space.
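Here is a minimal sketch of that pipeline, assuming the standard IEC 61966-2-1 sRGB transfer functions and helper names I've made up for illustration; the post's actual program may differ in detail.

```cpp
// Sketch of the described pipeline: build an evenly spaced ramp in sRGB,
// convert to linear, quantize there, then convert back to sRGB for display.
#include <cmath>
#include <cstdio>

// Standard sRGB decode: display (non-linear) value -> linear light.
static double srgb_to_linear(double s)
{
    return (s <= 0.04045) ? s / 12.92 : std::pow((s + 0.055) / 1.055, 2.4);
}

// Standard sRGB encode: linear light -> display (non-linear) value.
static double linear_to_srgb(double l)
{
    return (l <= 0.0031308) ? l * 12.92 : 1.055 * std::pow(l, 1.0 / 2.4) - 0.055;
}

// Snap a [0,1] value to the nearest of 2^bits levels.
static double quantize(double v, int bits)
{
    double levels = (1 << bits) - 1;
    return std::round(v * levels) / levels;
}

int main()
{
    const int bits       = 5;    // simulate one channel of a DXT 5-6-5 key color
    const int ramp_steps = 16;   // evenly spaced gray ramp in sRGB

    for (int i = 0; i <= ramp_steps; ++i)
    {
        double s   = (double)i / ramp_steps;                   // ramp value, sRGB space
        double out = linear_to_srgb(                           // back to sRGB for display
                         quantize(srgb_to_linear(s), bits));   // quantize in linear space
        std::printf("in %.3f -> out %.3f\n", s, out);
    }
    return 0;
}
```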
What can we see? Well, if we have 16 bits per channel we're still okay. But at 8 bits (the normal way to send an uncompressed texture to the GPU) we have visible banding in the darker regions. This is because linear isn't an efficient way to space out limited bits for our eyes: linear spacing spends codes on bright values we can barely tell apart and leaves too few for the dark values where we're most sensitive.
The situation is really bad for the 6 and 5-bit compressed textures; we have so little bandwidth that the entire dark side of the spectrum is horribly quantized.
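One way to put a number on how badly the dark end fares (my arithmetic, using the same standard transfer function as above): with 8 bits spent in linear space, the first code above black (1/255 linear) already converts back to roughly 0.05 in sRGB - about a 13/255 jump in perceptual terms - so the darkest twentieth or so of the ramp collapses into a single band. A small sketch:

```cpp
// Sketch: print the sRGB-space value of the first few 8-bit codes when the
// bits are spent in linear space. The very first step from black already
// lands near 0.05 in sRGB, so every darker shade falls into one band.
#include <cmath>
#include <cstdio>

// Standard sRGB encode: linear light -> display (non-linear) value.
static double linear_to_srgb(double l)
{
    return (l <= 0.0031308) ? l * 12.92 : 1.055 * std::pow(l, 1.0 / 2.4) - 0.055;
}

int main()
{
    for (int code = 0; code <= 4; ++code)
    {
        double linear = code / 255.0;   // 8-bit code, linearly spaced
        std::printf("code %d -> sRGB %.3f\n", code, linear_to_srgb(linear));
    }
    return 0;
}
```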
The moral of the story (if there is one): gamma is your friend - it's non-linear, which is annoying for lighting shaders, but when you have 8 bits or less, it puts the bits where you need them.