How does your camera and image processing software affect and determine colors?

posted Friday, March 31, 2023 at 9:00 AM EDT


Color is a huge topic in photography. You'll be hard-pressed to wander through online photography forums without tripping over a group of shutterbugs arguing over which manufacturer's "colors are best." Like warring cavemen bashing each other with clubs, even the "winner" of a colorful debate, if there is such a thing, won't leave the battle unscathed.

Photography expert Jim Kasson has returned to the Lensrentals Blog to give as close to a "final word" on camera color as you can get. Kasson is extremely well-equipped to tackle the topic of color, and his detailed article clears up many common confusions. So, is there a camera that produces better colors?

Before getting to that pressing question, we must consider JPEG versus RAW. And no, I don't mean why you should definitely be shooting in RAW format, but rather the difference in color information between JPEG and RAW files.


While cameras process JPEG files in-camera and apply specific color toning depending upon selected settings, RAW files don't include inherent color values. RAW processing software and the image sensor's response to light both influence the colors in the final image, with the software on your computer having the larger impact of the two. The software's color profile is of particular importance.
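
To make that concrete, here's a minimal sketch in Python with NumPy. The numbers are entirely made up (no real camera or profile is represented); the point is simply that the raw developer multiplies the sensor's values by a profile-dependent matrix, so the same RAW data can come out as different colors under different profiles.

```python
import numpy as np

# One demosaiced raw pixel: linear sensor values (hypothetical numbers).
raw_rgb = np.array([0.42, 0.31, 0.18])

# Two hypothetical "profile" matrices mapping camera RGB to a working
# color space. Real profiles are supplied by the raw developer; these
# values are invented purely for illustration.
profile_a = np.array([[ 1.60, -0.45, -0.15],
                      [-0.30,  1.55, -0.25],
                      [-0.05, -0.60,  1.65]])
profile_b = np.array([[ 1.85, -0.70, -0.15],
                      [-0.20,  1.40, -0.20],
                      [ 0.00, -0.45,  1.45]])

# The same raw data produces different colors under different profiles.
print(profile_a @ raw_rgb)
print(profile_b @ raw_rgb)
```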

My philosophy background is banging on the door, wanting to enter the conversation. There's also a key distinction between subjectivity and objectivity that must be made when discussing color. Light, and how an image sensor translates light into color values, can be measured objectively. If we accept that color, or at least spectra, exists as a true, factual characteristic of stuff in the world, then it's likely that these colors can be measured with tools to obtain objective color information. That is, information about color that isn't dependent upon any individual.

Ignoring the rather troublesome elephant in the room (objective data must be collected by subjective individuals, and the very framework by which an individual collects and analyzes that data has subjective underpinnings), there's no doubt that any individual's color perception is subjective, not objective. We all experience color differently via remarkably complex physical and psychological processes. While people with typical color vision can certainly agree on which colors are which, it's impossible to know whether someone else perceives any given color the same way you do.


If the discussion is about which camera makes colors look better, then "better" must be explained. If a Nikon's colors are "better" because you like them more, then the discussion is relatively worthless. It's great that you like something and equally great if someone likes something else. Done and dusted.

However, if "better" colors mean more accurate colors, then there's hope for some progress to be made. At the very least, there are ways to have a fruitful discussion on color within this context, accepting the role of RAW processing. It's also important to accept that you don't necessarily need to prefer more accurate colors, and there's nothing wrong with "better color" really referring to what you like. Photography, while intertwined brilliantly with science, is art.

As Kasson writes, "Color is a complex subject. I spent six years doing color science research for IBM, and there are still many things that the psychologists know that I don’t, and more things the psychologists don’t know." His article at Lensrentals is dense, full of science, and brimming with math. It's an illuminating read for those who want all the nitty-gritty details, or at least as many as Kasson could fit in a single article.


Here, I'll just hit the highlights, at least those I can grasp enough to summarize. "Color" is worth expanding upon further. "When color scientists use the word color in the context of reproduction of the real world, they mean something more specific. First, color is a psychological, not a physical, concept. It describes how color-normal people perceive spectra; it does not describe the spectra themselves. Second, color is quantitative. It is defined by three numbers. What the three numbers mean depends on the system – the technical term is color space – being used," writes Kasson.
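
Those "three numbers" are easier to appreciate with an example. Here's a small sketch converting a CIE XYZ triple into everyday 8-bit sRGB values using the standard published matrix and transfer function; the input XYZ values are just example numbers I picked for a near-neutral gray.

```python
import numpy as np

# Standard matrix from CIE XYZ (D65 white) to linear sRGB.
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def xyz_to_srgb(xyz):
    """Convert one XYZ triple to 8-bit sRGB values."""
    linear = XYZ_TO_SRGB @ np.asarray(xyz, dtype=float)
    linear = np.clip(linear, 0.0, 1.0)
    # sRGB transfer function ("gamma").
    srgb = np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * np.power(linear, 1 / 2.4) - 0.055)
    return np.round(srgb * 255).astype(int)

# Example XYZ triple, roughly proportional to the D65 white point,
# so it should come out as a mid-gray (three nearly equal values).
print(xyz_to_srgb([0.20, 0.21, 0.23]))
```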

Kasson explains this topic in fantastic detail, but we can move forward with the understanding that color reproduction fundamentally relies upon converting spectra, intensities and frequencies of light waves, to color. "And, by definition, for color-normal people, you do that by applying a set of color weighting functions to the input spectra," writes Kasson, adding, "In an ideal world, you would do that for every pixel in the image. But that’s not how consumer cameras work."
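
Here's a rough sketch of that mechanism. The color-matching function values below are coarse placeholders rather than the real CIE 1931 tables, and the spectrum is invented, but the mechanics are the same: weight the spectrum by three functions and integrate to get three numbers.

```python
import numpy as np

# Wavelength samples and *placeholder* color-matching functions.
# The real CIE 1931 tables are published at fine wavelength steps;
# these coarse made-up values just show the mechanics.
wavelengths = np.arange(400, 701, 50)            # nm
cmf = np.array([                                 # rows: the three weighting functions
    [0.01, 0.13, 0.30, 0.10, 0.75, 0.90, 0.10],
    [0.00, 0.04, 0.20, 0.80, 0.95, 0.45, 0.04],
    [0.07, 0.65, 0.55, 0.05, 0.00, 0.00, 0.00],
])

def spectrum_to_xyz(spectrum):
    """Weight a spectrum by the matching functions and integrate."""
    spectrum = np.asarray(spectrum, dtype=float)
    return cmf @ spectrum * (wavelengths[1] - wavelengths[0])

# A hypothetical spectrum of light, sampled at the same wavelengths.
spectrum = np.array([0.2, 0.3, 0.5, 0.8, 0.9, 0.7, 0.4])
print(spectrum_to_xyz(spectrum))   # three numbers: a color
```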

When different spectra convert to the same color, they're metamers, and that occurrence is a "metameric match." Conversely, when someone expects a metameric match but one doesn't occur, that's a metameric failure. There are different reasons this can happen, including how various materials reflect light, how individual observers perceive color, and how different cameras capture color.
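
A quick way to see a metameric match in code: take any spectrum, add a "metameric black" (a spectrum the weighting functions integrate to zero), and you get a physically different spectrum that resolves to exactly the same color. This reuses the placeholder numbers from the sketch above.

```python
import numpy as np

# Same placeholder 3x7 matching-function matrix as the previous sketch.
cmf = np.array([
    [0.01, 0.13, 0.30, 0.10, 0.75, 0.90, 0.10],
    [0.00, 0.04, 0.20, 0.80, 0.95, 0.45, 0.04],
    [0.07, 0.65, 0.55, 0.05, 0.00, 0.00, 0.00],
])
spectrum_1 = np.array([0.2, 0.3, 0.5, 0.8, 0.9, 0.7, 0.4])

# Any vector in the null space of cmf integrates to (0, 0, 0), a
# "metameric black". Adding one gives a physically different spectrum
# that resolves to exactly the same color: a metameric match.
_, _, vt = np.linalg.svd(cmf)
metameric_black = vt[-1]
spectrum_2 = spectrum_1 + 0.1 * metameric_black

print(cmf @ spectrum_1)
print(cmf @ spectrum_2)
print(np.allclose(cmf @ spectrum_1, cmf @ spectrum_2))   # True
```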


One of my favorite real-world examples of metameric failure involves sports uniforms. Have you ever been watching a hockey game and noticed that a team's blue helmet doesn't match their blue pants, which in turn don't match their blue gloves? While the "blue" in question is a specific blue with defined color values, the different parts of the uniform are made of physically different materials that reflect light in different ways.

The only way to deal with this is to tune the specific colors used on different materials so that they appear to be the same. But even then, different people may see the blues differently. And further, different cameras may see them differently, too. Sometimes colors must be "wrong" in order to appear "right" to color-normal viewers.

Jumping way ahead by going a bit backwards, what's in a RAW file? There are three types of data in a RAW image file: RAW image data, a JPEG preview (what you see on the camera when you're shooting RAW), and metadata. The RAW image data is "the response of each of the camera's pixels to the light that fell on them during exposure." Many cameras use a color filter array (CFA), so each photosite records light through a single red, green, or blue filter and the RAW image data is actually monochromatic. The RAW developer later interprets (demosaics) that data into full red, green, and blue values for each pixel.
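
As a toy illustration of what the developer has to do with that monochromatic mosaic, here's a deliberately crude demosaic that collapses each 2x2 RGGB block into one RGB pixel. Real demosaicing algorithms are far more sophisticated, and the raw values here are invented.

```python
import numpy as np

# A tiny 4x4 mosaic of raw values (hypothetical numbers), one number per
# photosite, laid out in an RGGB Bayer pattern:
#   R G R G
#   G B G B  ...
raw = np.array([[0.61, 0.40, 0.58, 0.39],
                [0.42, 0.22, 0.41, 0.21],
                [0.60, 0.41, 0.57, 0.38],
                [0.43, 0.20, 0.40, 0.19]])

def naive_demosaic(mosaic):
    """Crude 2x2-block demosaic: each block yields one RGB pixel."""
    h, w = mosaic.shape
    rgb = np.zeros((h // 2, w // 2, 3))
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            block = mosaic[y:y + 2, x:x + 2]
            rgb[y // 2, x // 2] = [block[0, 0],                      # R
                                   (block[0, 1] + block[1, 0]) / 2,  # average of the two Gs
                                   block[1, 1]]                      # B
    return rgb

print(naive_demosaic(raw))   # 2x2 image, three numbers per pixel
```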

While software, color space, color profile, and RAW processing are a huge part of the color you see in a final image, the sensor itself does matter. The image sensor has spectral characteristics of its own, including the pigments or dyes in the color filter array, the spectral filtering of the infrared-rejection filter in front of the sensor, and the spectral response of the silicon itself. We're ignoring Foveon sensors for now because they work differently.

"If all hot mirrors (infrared rejection filters) were perfect, and all CFA filter spectra combined with the silicon frequency response were a linear combination of the responses of the human eye and the raw developer was correctly designed and optimized for accuracy, there would be no difference at all in the color obtainable from various brands and models of cameras," writes Kasson. However, "As stated earlier, that condition – known to color aficionados as the Luther-Ives criterion – is met by precisely zero consumer cameras."

What that basically means is that no camera, in conjunction with a RAW developer, is perfectly optimized to deliver accurate color. Imperfections, therefore, can give rise to differences in how different cameras capture color.
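
One hedged way to picture the Luther-Ives test numerically: if a camera's spectral sensitivities really were a linear combination of the eye's color-matching functions, you could find a 3x3 matrix that maps one set onto the other with zero error. A least-squares fit, as sketched below with invented numbers, leaves a nonzero residual for every real camera.

```python
import numpy as np

# Placeholder 3x7 matrices sampled at the same wavelengths: the eye's
# color-matching functions and a camera's spectral sensitivities
# (CFA dyes + hot mirror + silicon). All numbers invented for illustration.
cmf = np.array([
    [0.01, 0.13, 0.30, 0.10, 0.75, 0.90, 0.10],
    [0.00, 0.04, 0.20, 0.80, 0.95, 0.45, 0.04],
    [0.07, 0.65, 0.55, 0.05, 0.00, 0.00, 0.00],
])
camera = np.array([
    [0.02, 0.10, 0.25, 0.15, 0.70, 0.80, 0.15],
    [0.01, 0.08, 0.30, 0.75, 0.85, 0.35, 0.05],
    [0.10, 0.60, 0.50, 0.10, 0.02, 0.01, 0.00],
])

# Least-squares fit of a 3x3 matrix M so that M.T maps cmf onto camera.
# A camera meeting the Luther-Ives criterion would fit with zero error.
M, residual, *_ = np.linalg.lstsq(cmf.T, camera.T, rcond=None)
fit_error = np.linalg.norm(camera - (M.T @ cmf))
print(fit_error)   # nonzero: this camera's color cannot be made exact
```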

 
Adobe Standard

Cameras are (imperfectly) calibrated to have the colors in an image match the colors in a scene – the light that actually hit the image sensor. However, most photographers, myself included, don't necessarily want their photos to have accurate color. "Accurate" color often appears dull, washed out, and, for lack of a better word, bad.

Beyond seeking accuracy, color profiles also try to achieve a look that people generally enjoy. If you've heard the term "memory color," then you're well on your way to understanding this aspect of color profile engineering. Memory colors, like a blue sky or rich skin tones, are deliberately shifted by cameras and profiles so they "seem" more accurate and lifelike, even though they aren't.

"Most photographers don’t want their images to have accurate color. They look flat that way, and skin tones look pallid. The second part of the color profile is used to get the 'look' that pleases most people," says Kasson. "Different distortions from accurate color seem to work best in some circumstances and not in others. Different photographers prefer different color mappings. For these reasons, different profiles are supplied by most RAW developer producers."

 
Adobe Vivid

For example, if you open a RAW image in Adobe Lightroom or Adobe Camera Raw, you can select from different Adobe color profiles, like Adobe Standard, Adobe Color, Adobe Portrait, Adobe Landscape, and Adobe Neutral.

Photographers are at the whim of profile design. We can't dig into individual calibration or intent components and tweak them. However, we can see how calibration and intent affect different cameras, which is itself interesting.

Kasson also explains that some differences in color are due to differences in the camera. Software processing and profiles cannot explain every visible difference. "As an example, let’s imagine that a Sony A7R IV sees two spectra that resolve to different colors as the same. No profile will be able to tell which of those spectra produced which set of values in the raw file, and the two different colors will look like the same color in the final image," Kasson writes. Now consider a Nikon Z7 in that situation but with other different-color spectra being seen as the same. The Sony and Nikon cameras will produce different colors from a scene with these described spectra.
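
Here's a sketch of that scenario with invented spectral sensitivities (not real Sony or Nikon data): two spectra constructed so that "camera A" records them identically while "camera B" tells them apart. No profile applied downstream can undo that loss of information in camera A's raw file.

```python
import numpy as np

# Placeholder 3x7 spectral sensitivities for two different cameras
# (invented numbers, not measurements of any real camera).
camera_a = np.array([
    [0.02, 0.10, 0.25, 0.15, 0.70, 0.80, 0.15],
    [0.01, 0.08, 0.30, 0.75, 0.85, 0.35, 0.05],
    [0.10, 0.60, 0.50, 0.10, 0.02, 0.01, 0.00],
])
camera_b = np.array([
    [0.03, 0.12, 0.20, 0.20, 0.65, 0.85, 0.10],
    [0.02, 0.05, 0.35, 0.70, 0.90, 0.30, 0.03],
    [0.08, 0.70, 0.45, 0.05, 0.01, 0.00, 0.00],
])

# Build two spectra whose difference lies in the null space of camera A's
# sensitivities, so camera A records them identically...
_, _, vt = np.linalg.svd(camera_a)
spectrum_1 = np.array([0.4, 0.5, 0.6, 0.7, 0.6, 0.5, 0.4])
spectrum_2 = spectrum_1 + 0.2 * vt[-1]

print(np.allclose(camera_a @ spectrum_1, camera_a @ spectrum_2))  # expected: True
# ...but camera B almost certainly tells them apart, so the two cameras
# cannot both render this scene the same way, whatever the profile.
print(np.allclose(camera_b @ spectrum_1, camera_b @ spectrum_2))  # expected: False
```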

Now consider that every image sensor is prone to slightly different performance. The same applies to lenses, displays, photo paper and ink, and our eyes. There are innumerable ways in which light can interact in varied ways with different objects and parts of the overall photographic process. Color science is an immense topic that one could spend a lifetime investigating.

However, at the end of the day, cameras do see color differently, and image processing is perhaps the biggest and least considered part of the debate on which cameras produce the best color. But perhaps most importantly, how you see and feel about color matters too.

To read much more about cameras, image processing, and color, head over to Jim Kasson's full article at Lensrentals.


Image credits: Lensrentals and Jim Kasson