Which is better for astrophotography, a color or a mono camera? Both of course!

Previously, I've pleaded with beginners to start with monochrome cameras. They are more sensitive and produce black-and-white images that are easier to learn to calibrate and adjust. Black-and-white pictures can be quite beautiful and reveal subtle contrast and nuance in ways that color cannot. It wasn't my most popular blog. People just want color.

Filter Wheel
Colored filters that revolve into place in front of a monochrome imager.
Richard S. Wright Jr.

Okay fine, but monochrome cameras can produce color images, too, when paired with a filter wheel. Is that a better route than using a one-shot color (OSC) camera? Well, that's like asking which is better, a car or a truck? That depends on what you want out of your vehicle. I know people with both a car and a truck, and just like them I use both approaches to create my color images.

The most versatile imaging approach is to use a monochrome sensor paired with a filter wheel. With this configuration, you can shoot color pictures using red, green, and blue filters. You can also shoot an image with no filters, which you can then combine with your color data to greatly improve its signal-to-noise ratio.
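To see why adding an unfiltered (luminance) exposure helps, here's a minimal sketch in plain NumPy of one common way to blend the two: keep the color ratios from the RGB frames but take the brightness from the higher-S/N luminance frame. The function name, the synthetic frames, and the noise levels are all hypothetical, chosen just to illustrate the idea.

```python
import numpy as np

def lrgb_combine(l, r, g, b, eps=1e-6):
    """Blend a deep luminance frame with RGB color data.

    Keeps the color ratios implied by the RGB frames but rescales
    each pixel so its brightness comes from the luminance frame.
    All inputs are 2-D arrays on the same linear scale.
    """
    rgb_lum = (r + g + b) / 3.0          # brightness implied by color data alone
    scale = l / (rgb_lum + eps)          # per-pixel brightness correction
    return r * scale, g * scale, b * scale

# Tiny synthetic example: noisy color frames, cleaner luminance frame
rng = np.random.default_rng(0)
true = np.full((64, 64), 100.0)          # a flat 100-count "target"
r = true + rng.normal(0, 10, true.shape)
g = true + rng.normal(0, 10, true.shape)
b = true + rng.normal(0, 10, true.shape)
l = true + rng.normal(0, 3, true.shape)  # luminance: more signal, less noise

r2, g2, b2 = lrgb_combine(l, r, g, b)
print(r.std(), r2.std())  # the combined channel is noticeably smoother
```

The color ratios survive the rescaling, so hue is preserved while the pixel-to-pixel noise drops toward that of the luminance frame.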

Additionally, monochrome cameras can capture images at particular regions of the spectrum, even beyond visible light, when combined with narrowband filters. Analyzing light at these specific wavelengths can teach us more about the imaged target than what can be gleaned from an RGB color image.

RGB Composition
Separate red, green, and blue exposures are combined to make a high-S/N color image.
Richard S. Wright Jr.

Users of color cameras sacrifice some of this creative freedom (and scientifically useful data) for the convenience of taking only one exposure to create a color image. To its credit, OSC data is much less work to process than assembling a color image from several separate color channels. But some mistakenly believe that a single OSC exposure gives you three times the data of a monochrome camera, which requires three exposures.

Not quite.

This narrowband image shows ionized hydrogen, oxygen, and sulfur.
Richard S. Wright Jr.

Regardless of whether you’re shooting M42, the Orion Nebula, or your child’s birthday party, the sensor in a color camera is made up of individual pixels with tiny filters that record red, green, and blue light. The detector is divided into a patterned grid, known as a Bayer filter, that contains 25% red-filtered pixels, 50% green, and 25% blue. The final output image uses this color data but fills in the two missing color values at each pixel with values derived from adjacent pixels, in a mathematical process called interpolation. This works pretty well and is the basis of almost all color cameras available today.

But OSC camera users should understand that the filled-in data isn't “real,” nor does it help the all-important signal-to-noise ratio of your image. A single OSC exposure collects ¼ the red and blue data, and ½ the green data, compared to a single exposure through a filter with a monochrome camera.
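Those fractions fall straight out of the mosaic layout. Here's a minimal sketch (plain NumPy, hypothetical tiny sensor) that builds the standard RGGB Bayer pattern and counts how many pixels actually sample each color:

```python
import numpy as np

# Per-pixel color assignment of an RGGB Bayer mosaic:
# each sensor pixel records only one of R, G, or B.
h, w = 4, 4  # any even dimensions work; tiny for illustration
pattern = np.empty((h, w), dtype="<U1")
pattern[0::2, 0::2] = "R"  # even rows, even columns
pattern[0::2, 1::2] = "G"  # even rows, odd columns
pattern[1::2, 0::2] = "G"  # odd rows, even columns
pattern[1::2, 1::2] = "B"  # odd rows, odd columns

for color in "RGB":
    frac = (pattern == color).sum() / pattern.size
    print(color, frac)
# R and B each cover 1/4 of the pixels, G covers 1/2 —
# the two missing values at every pixel must be interpolated.
```

Half the grid goes to green because our eyes are most sensitive there, but no channel is ever sampled at every pixel, which is exactly why the interpolated values can't add real signal.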

There is no shortcut. Ten minutes of data is 10 minutes of data regardless of whether it’s monochrome or one-shot color. A color sensor is really just a mono sensor with the color filters deposited on top of the pixels.

So, do you really get the same results from OSC and mono cameras? Again, not quite. Compared to a highly transmissive color filter in front of a monochrome camera, the Bayer matrix blocks about 30% more of the light, and with some manufacturers it can block as much as 50% more. That’s a lot of lost signal.

Debayer Process
Creating a complete color image with OSC camera data.
Sky & Telescope

In terms of raw performance and S/N ratio using equivalent sensors, mono wins hands down . . . if you think like an engineer. Now let's think about it as an end user.

Processing OSC data is much easier than making a color image out of monochrome data. For some people, this is a big deal (especially if they're not engineers and don’t enjoy spending extra time processing data on their computer!). All that's really needed is more exposure time for OSC data to catch up with the monochrome approach. This is a perfectly valid trade-off.

Same night, same optic, Sony 814c sensor vs. 814 mono w/filters. Both images are 12 minutes total exposure, but the bottom one (RGB) exhibits slightly less noise and a stronger red and blue response (stretched, as close to raw as possible).
Richard S. Wright Jr.

Additionally, today’s sensors are much more sensitive than just a few years ago. If you aren’t comparing the same sensors, a newer generation OSC sensor may well have better sensitivity and/or lower read-noise than an older-generation monochrome camera. If you’re upgrading an old mono camera to a newer color camera, you might actually see a nice boost in performance, despite taking the OSC route.

Workflow is important too. I’ve done many imaging runs where I had 3 hours of red data and 45 minutes of green data, but no blue data because the weather turned unexpectedly cloudy. Do you need to refocus between filters? Sometimes no, and sometimes this is a showstopper. Are you traveling and need to get all your data in one unpredictable night? Is a filter wheel just one more thing that can go wrong while you’re on the road? Target selection is also important: Maybe you’re photographing a bright target, and a little loss in signal isn’t going to be too big of a deal. Maybe you also have a very fast optic, like a RASA f/2 system or a very fast camera lens. In those cases, you’re getting so much signal so fast that exposing just a little bit longer is no serious burden at all.

In all of these cases, a color camera might be the better bet. So, car or truck? Sure, a truck can carry a heavier load, but sometimes it's nice to just put the top down and enjoy the ride.

