There's more than one way to stretch your data . . . and the truth.
What does it mean to stretch your image? Simply put, stretching means scaling your data. Last month I talked about how data from a 12- or 14-bit camera might be scaled to fill the 16 bits of dynamic range available for manipulation by image-processing software. This scaling is linear. For example, images from a DSLR camera range from 0 to 16,383 (14-bit), and that can be multiplied by 4 to cover the 16-bit range (0 to 65,535) for image processing, or divided by 64 to fit the 8-bit range (0 to 255) that your computer display can represent.
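The arithmetic above can be sketched in a few lines of Python with NumPy (the pixel values here are made up for illustration):

```python
import numpy as np

# Hypothetical 14-bit sensor values; 16383 is the 14-bit maximum.
raw_14bit = np.array([0, 1000, 8192, 16383], dtype=np.uint16)

# Multiply by 4 to fill the 16-bit range (0..65535) for processing.
scaled_16bit = raw_14bit.astype(np.uint32) * 4   # tops out at 65532

# Integer-divide by 64 to fit the 8-bit display range (0..255).
display_8bit = (raw_14bit // 64).astype(np.uint8)  # tops out at 255
```

Both operations are linear: every pixel is multiplied (or divided) by the same constant, so the relative brightnesses are untouched.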
Now, there are those who say that once you’ve stretched the data, you’ve destroyed it. Or that when you manipulate the data from the camera, you’re not getting a true representation of the object. As a computer science professional and graphics expert, I just roll my eyes when I hear this. Otherwise, I'd start an argument, much like the one I had with my daughter when she was six and insisted that the Moon followed us home from grandma's, because she saw it with her own eyes.
In fact, it's surprisingly hard to display a RAW image from a DSLR camera without any adjustments. Almost all software stretches data just to show it on a screen. (Let's skip the fact that the un-manipulated data isn’t in color — we'll talk about that some other time.) Here’s a colorized image from a well-exposed daytime shot right off my Canon DSLR:
Pretty dark, isn’t it? That’s because it’s 14-bit data displayed in a 16-bit container, as I talked about last month. When mapped to your computer’s display, it has roughly one-quarter of the brightness range it should have. Even if we scale it up by a factor of 4, you may still not be all that impressed.
It’s still pretty dark even though the data range covers the full 16 bits (and is still linear). That’s right, the brightness values go up into the 65,000s. When mapped to your display, the pixel values go all the way up to 255, at least in the green channel. Did you notice that the image looks a bit green, not to mention flat and somewhat featureless? “Photography should be about what comes out of the camera, not the processing” . . . yeah, right. A modern DSLR does a tremendous amount of processing for you. If you are using a scientific camera for astrophotography, you have to correct for all kinds of things yourself, or else.
What does the above image look like if we open it in a typical DSLR RAW conversion program, and then do, say, 15 seconds worth of work on it? Ta-da:
This image looks a lot more like what I saw on the side of the road somewhere in Kansas. It’s not even remotely linear now. Neither, by the way, is the human visual system. I’m going to let that work on you for a minute or two. If it sinks in correctly, then you too can roll your eyes the next time you hear some "expert" claim that astrophotography is all just fantasy. Such experts are in love with data, to the point that they don't grasp what “photography” is about. I’m not sure a better example of “missing the forest for the trees” can be found.
So, why am I using a regular terrestrial image to demonstrate this when we’re talking about astrophotography? Because if I used an astrophoto, someone would probably still try to argue with me about what comes next, which is that astrophotography isn't special (other than how we feel about it, of course!). It’s just really, really, REALLY low-light photography. We have to pay a lot more attention to things we take for granted when there’s an abundance of light. And those low-light conditions mean the process of stretching our data becomes a little more interesting.
Other than that, yep, we worry about the same stuff as in regular photography: We still have to color balance, stuff often looks too green right off a color camera, and we have to stretch the living daylights out of our data to get a good result. And if your image is under-exposed (which means it doesn’t have a good signal-to-noise ratio), it’s going to look terrible when you start stretching it.
Let’s look at a typical monochrome astrophoto and its histogram. This is M31, the Andromeda Galaxy. The core shows up without any stretching, but most of the image is very dark. That isn’t a bit-depth problem this time; the recorded values really are faint. As the histogram beneath it shows, most of the pixel values are huddled at the bottom of the brightness range.
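You can see that kind of lopsided histogram with a couple of lines of NumPy. The pixel values below are synthetic, standing in for a deep-sky frame: a sea of faint sky background plus a small bright core.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "astrophoto": mostly faint background near the bottom of the
# 16-bit range, with a small, bright core region.
sky = rng.normal(loc=800, scale=100, size=99_000)       # faint background
core = rng.normal(loc=50_000, scale=5_000, size=1_000)  # bright galaxy core
image = np.clip(np.concatenate([sky, core]), 0, 65535)

# Histogram over the full 16-bit range, 16 coarse bins.
counts, edges = np.histogram(image, bins=16, range=(0, 65535))

# About 99% of the pixels land in the very first (darkest) bin.
fraction_in_first_bin = counts[0] / image.size
```

That tall spike at the far left of the histogram is exactly what you see under a real deep-sky image.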
So, how about a nice linear stretch? Turns out we have to multiply the entire image by 256 to be able to see most of the galaxy well. Ugh, but look what happened to the middle part of the galaxy! It’s all solid white and saturated, and the histogram shows a tall, bright line all the way to the right. We’ve lost all the details in the center of the image because the middle is so much brighter than the outer part of the galaxy.
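Here’s that failure in miniature, with hypothetical pixel values: multiply everything by 256 and clamp to 16 bits, and the faint outskirts become visible while the core slams into pure white.

```python
import numpy as np

# Hypothetical pixel values: faint outer arm vs. bright galaxy core (16-bit image).
outer_arm, core = 100, 5000

# A big linear stretch: multiply by 256, then clamp to the 16-bit maximum.
stretched = np.clip(np.array([outer_arm, core]) * 256, 0, 65535)

# outer_arm: 100 * 256 = 25600 -- now easily visible.
# core: 5000 * 256 = 1,280,000, clamped to 65535 -- pure white.
# Any input pixel above 255 counts (65535 / 256) saturates, so every
# brightness difference inside the core is wiped out.
```

That clamped region is the tall, bright line at the right edge of the histogram.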
If you want to properly represent this object, you can't just do a linear stretch of the data. Perhaps one day, computer displays will have more than just 256 brightness levels; until then, we have to use a nonlinear stretch to compress the brightness range. This scaling helps us brighten the dark areas (the shadows) but not the already-bright areas (the highlights).
We usually perform this stretch with a "curve" tool, which applies a brightening factor to the image information in a nonlinear way. The end result is that we brighten up the shadows and kinda-sorta leave the highlights alone, or at least brighten them much less than the shadows.
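One simple curve of this kind is a gamma stretch. It's just one of many possible shapes (real curve tools let you draw arbitrary ones), but it shows the effect: the shadows get a huge boost while the highlights barely move.

```python
import numpy as np

def gamma_stretch(pixels, gamma=0.25):
    """Nonlinear stretch: normalize to 0..1, raise to a power < 1, rescale.

    Exponents below 1 lift the shadows far more than the highlights.
    """
    norm = np.asarray(pixels, dtype=np.float64) / 65535.0
    return norm ** gamma * 65535.0

faint, bright = gamma_stretch([100.0, 60000.0])
# faint:  a 100-count pixel jumps to roughly 13,000 -- a huge boost
# bright: a 60,000-count pixel only creeps up to about 64,100
```

Both ends still fit in the display range, but the ratio between core and arms has been compressed to something your monitor (and your eye) can actually show.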
If you need to maintain the linearity of your data for scientific reasons, a linear stretch can still make the data more interesting to look at without compromising its scientific integrity. When it comes to nonlinear stretches, though, I want you to consider a few things: the human visual system is nonlinear (close to logarithmic, in fact), and we can see a far wider dynamic range than any computer display can reproduce without some fudging of the data. We perform nonlinear stretches on photos of birds, sunsets, and bridal showers all the time, and no curmudgeons get on soapboxes about how those images are useless and misleading.
Perhaps we should all just stop arguing about how things we can’t actually see are supposed to look!