This is the first installment of a three-part blog series on narrowband astro-imaging. Today’s discussion covers the general topic and its principles.

The subject of color in the universe is frankly loaded with controversy. Many are convinced we have no idea what the true color of a deep space object is from our perspective here on Earth. If you have been to a star party and looked at the Orion Nebula through a telescope, what did you see? If the sky is dark enough, you see a green glow. You certainly don’t see the bright magenta and pink-red colors that just about every image of the Orion Nebula portrays. This is because the eye is a poor color detector at night: the color receptors in the retina, called “cones,” are mostly inoperative in low light, and what you can see of a nebula tends to be greenish because that is the part of the spectrum the eye is most sensitive to. That doesn’t mean the colors aren’t there.

For the most part, those of us who have spent the last 15-20 years imaging the cosmos have come to a general consensus regarding the appearance of certain deep space objects when imaged using standard digital methods. By that I mean that for a color image you are doing essentially the same thing you do when taking a picture of your family with a smartphone. Inside the camera is an array of red, green, and blue filters which, when combined, gives you a color image. By terrestrial criteria, the colors are “real”: the sky is blue, a pine tree has a deep forest green color, a typical tomato is red, and so on. When applied to astronomical objects, you can use the same sort of digital camera with its built-in filter array, or you can use a monochrome CCD or CMOS sensor with separate filters and combine the greyscale images into a single color image.
When you do that, you see that emission nebulae are reddish, reflection nebulae are blue, and galaxies have soft yellow cores with rust-colored dust lanes; some galaxies also show areas of active star formation, which appear reddish pink. This is pretty consistent and holds for what I call standard “broadband” images.
In the usual case, the red, green, and blue filters each select a fairly broad region of the visible spectrum, roughly 100 to 200 nanometers wide, with the passbands slightly overlapping. This produces your standard amateur astronomical image. A “luminance” image can also be taken; the luminance filter blocks the infrared but allows all visible wavelengths to be recorded at once. This image typically has the best signal with the least noise, and adding it to the color image can boost the overall image quality.
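To make the LRGB idea concrete, here is a minimal sketch in Python with NumPy of one common way to blend a high signal-to-noise luminance frame with RGB color data. The function name, the normalization, and the simple “luminance replacement” blend are my own illustrative choices, not a description of any particular software’s algorithm:

```python
import numpy as np

def lrgb_combine(l, r, g, b, lum_weight=0.8):
    """Blend a high-SNR luminance frame (l) with three color frames (r, g, b).

    Sketch of one simple approach: normalize each frame to 0-1, stack the
    color frames, then rescale each pixel's color so its brightness follows
    the cleaner luminance frame.
    """
    def norm(x):
        x = x.astype(np.float64)
        return (x - x.min()) / (x.max() - x.min() + 1e-12)

    l, r, g, b = map(norm, (l, r, g, b))
    rgb = np.stack([r, g, b], axis=-1)          # shape (H, W, 3)

    # Replace the RGB stack's brightness with the luminance frame,
    # keeping the color ratios from the filtered frames.
    brightness = rgb.mean(axis=-1, keepdims=True)
    lum_driven = rgb / (brightness + 1e-12) * l[..., None]

    # Mix the luminance-driven result with the plain RGB stack.
    out = lum_weight * lum_driven + (1 - lum_weight) * rgb
    return np.clip(out, 0.0, 1.0)
```

Real processing pipelines do considerably more (calibration, registration, stretching, and color-space work), but the core idea is the same: the luminance carries the detail, the filtered frames carry the color.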
I think we are now pretty comfortable with the approximate “accepted” colors for most deep space objects imaged with these methods. However, while we were settling into this comfort zone, out in space the Hubble Space Telescope was also imaging the universe. The standard “LRGB” methodology of combining red, green, and blue filtered images with an unfiltered “L” image having better signal-to-noise properties works well for our small ground-based amateur telescopes, but Hubble and the large professional ground-based observatories can’t do this for scientific data. The reason is that broadband filtering loses wavelength-specific structural information. They have to image at very specific wavelengths, with the goal of increasing contrast between areas of energy emission from an object and the background, known as the “emission continuum.” Excluding the other wavebands actually improves image quality. And so, in the mid-to-late 1990s, “narrowband” imaging was born, thanks to the Hubble telescope!
Instead of imaging through filters that select 100 to 200 nm of spectral bandwidth, narrowband filters allow only a few nanometers of light to pass through. As a result, these filters detect very specific wavelengths of light, typically emitted by gases in space. The most common elements emitting this light are hydrogen and oxygen; sulfur is also fairly abundant. This is why the three most commonly used narrowband filters are hydrogen-alpha, which transmits light from atomic hydrogen at 656.3 nanometers; sulfur II, which transmits the deep red light emitted by singly ionized sulfur; and oxygen III, which transmits the green-blue light of doubly ionized oxygen at 500.7 nm.
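A tiny sketch can show why a narrow passband boosts contrast: an emission line like hydrogen-alpha passes both a narrowband filter and a broad red filter, but the broad filter also admits a huge swath of continuum and skyglow that the narrow one rejects. The Ha and OIII center wavelengths below are from the text; the SII line near 672 nm and the specific bandwidth figures are typical commercial values I am assuming for illustration, not values from this article:

```python
# Central wavelength and full bandwidth, both in nanometers.
# Ha and OIII centers come from the text; SII (~672 nm) and the
# 7 nm / 100 nm bandwidths are assumed typical figures.
FILTERS = {
    "Ha":    (656.3, 7.0),
    "OIII":  (500.7, 7.0),
    "SII":   (672.4, 7.0),
    "R_broad": (620.0, 100.0),
}

def passes(filter_name, wavelength_nm):
    """True if the wavelength falls inside the filter's passband."""
    center, width = FILTERS[filter_name]
    return abs(wavelength_nm - center) <= width / 2.0

# Both filters transmit the Ha emission line itself, but the broad red
# filter also admits roughly 14x more bandwidth of background light,
# washing out the faint nebular signal the narrowband filter isolates.
```

The emission line gets through either way; what the narrowband filter removes is almost everything else, which is exactly the contrast gain described above.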
Next time we’ll discuss the iconic Hubble image that changed the way both professionals and amateurs approach color, along with more narrowband imaging concepts.
Thanks for reading!