Last time we discussed the basics of color in the universe and typical “broadband” color imaging: how it differs from narrowband imaging, and how the eye is not well adapted for nighttime color detection, yet despite that we have come to a consensus regarding “natural color” in the universe. We also saw how the Hubble Space Telescope almost always uses narrowband filtering, sometimes in the visible spectrum but very often outside the visible realm (e.g. infrared, ultraviolet). This method improves contrast by isolating the emission from the physical phenomenon being observed and “filtering out” the background.
About 25 years ago, the HST (Hubble Space Telescope) pointed its Wide Field Planetary Camera 2 (WFPC2) at a central region of the Eagle Nebula, M16, and produced the iconic “Pillars of Creation” image shown above. It was obtained with the narrowband SII, H-alpha and OIII filters. The public response to the release of this image was so overwhelming that it changed how the space science community would handle HST data and created a new philosophy toward dealing with it. It became necessary to understand and explore further how color schemes could be used to show depth, motion and texture in an astronomical object. Furthermore, once it was realized how this image and others like it could inspire the public and generate enthusiasm for astronomy in general, resources had to be allocated to processing and presenting these images for the lay public. For that reason the “Hubble Heritage” project was created, consisting of a team of scientists and, I believe, at least one amateur astronomer. A 2005 paper entitled “Image-Processing Techniques for the Creation of Presentation-Quality Astronomical Images” (Rector et al.) was published for the Astrophysical Journal for the sole purpose of demonstrating how to generate color images from Hubble’s data. It might be the only professional scientific paper without a single differential equation in it! It is a fascinating read, and for those of you doing any image processing it even goes into some detail on how to use Photoshop, in its early days, for combining and enhancing color images, and even how to use the Clone Stamp tool to remove star blooms! Since 2005 there has been an explosion of image-processing software. The PDF article can be downloaded here.
Of course, this being a scientific article, the concept of color and how to represent it had to be “scientific” as well, rather than simply concluding “this color looks good.” As it turns out, Isaac Newton bails them out, because he is the inventor of the color wheel!
“Color is parameterized … in an “HSV” colorspace wherein a specific color is defined by its hue, saturation and value. “Hue” describes the actual color, e.g., yellow or yellow-green. “Saturation,” also often referred to as intensity, is a measure of a color’s purity. A purely desaturated color is mixed with an equal amount of its “complementary” color. Drawing a line across the color wheel from a color, through the center of the wheel, to the color on the direct opposite side of the wheel defines a color’s complement. For example, purple is the complement of yellow. A purely saturated color contains none of its complement. “Value” is a measure of the lightness of a color, i.e., the amount of white or black added. Other terms used to modify color are “tint” and “tone.” A “tint” is a color with added white; e.g., pink is a tint of red. A “tone” is a color with added gray and a “shade” is a color with added black.” (Rector et al. 2005)
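To make the HSV terminology concrete, here is a short sketch of my own (not from the Rector paper) using Python’s standard `colorsys` module. One caveat: `colorsys` works on the RGB color wheel, where red’s complement comes out as cyan; the traditional painter’s wheel the paper describes (purple opposite yellow) is laid out differently.

```python
import colorsys

# Pure red in RGB (channel values in 0..1).
r, g, b = 1.0, 0.0, 0.0
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(h, s, v)  # 0.0 1.0 1.0 -> hue 0 (red), fully saturated, full value

# The complement sits 180 degrees (0.5 in colorsys units) across the wheel.
comp_h = (h + 0.5) % 1.0
print(colorsys.hsv_to_rgb(comp_h, s, v))  # (0.0, 1.0, 1.0) -> cyan

# A "tint" adds white, which in HSV terms means lowering the saturation:
pink = colorsys.hsv_to_rgb(h, 0.5, 1.0)
print(pink)  # (1.0, 0.5, 0.5) -> pink, a tint of red
```

Desaturating toward white while keeping full value is what turns red into pink, exactly the tint relationship the excerpt describes.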
Furthermore, “In general a good starting strategy when assigning color is to space the color assignments evenly around the color wheel; e.g., if creating an image with three datasets, assign undiluted colors that are 120 degrees apart on the color wheel. This is known as a triadic color scheme and is the simplest example of hue contrast” (Rector et al. 2005).
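That triadic spacing rule is simple enough to express in a few lines. This is my own sketch (assuming hues measured in degrees), again using `colorsys`:

```python
import colorsys

def triadic(base_hue_deg):
    """Three fully saturated RGB colors whose hues sit 120 degrees apart."""
    hues = [(base_hue_deg + k * 120) % 360 for k in range(3)]
    return [colorsys.hsv_to_rgb(h / 360.0, 1.0, 1.0) for h in hues]

# Starting from red (0 degrees), the triad is (up to floating-point
# rounding) the three RGB primaries themselves:
for color in triadic(0):
    print([round(c, 6) for c in color])
# [1.0, 0.0, 0.0]
# [0.0, 1.0, 0.0]
# [0.0, 0.0, 1.0]
```

Three narrowband datasets colorized this way get the maximum possible hue separation, which is the “hue contrast” the paper is after.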
The challenge was to come up with a scheme for the situation where color is not obtained through the usual optical broadband red, green and blue filters that produce a “natural” color image. This resulted in the terms “false color” or “representative color.”
The “scientific approach” cannot be better stated than in this excerpt from the Rector article:
“An example of an image produced using the chromatic ordering scheme is the Hester & Scowen (1995) HST image of M16. The data (Hester & Scowen 1995) were obtained with the HST WFPC2 camera with the F502N ([OIII] λ5012 Å), F656N (H-alpha λ6564 Å) and F673N ([SII] λ6732 Å) narrowband filters. To generate the image, the three datasets were colorized blue, green and red respectively. Filter F502N was assigned blue because it has the shortest wavelength passband, filter F673N was assigned red because it has the longest wavelength passband, and filter F656N was assigned green because its passband is of intermediate wavelength compared to the other two. The chromatic ordering for the HST M16 image is not a natural color scheme because the filters were not assigned their visible colors. That is, when viewed against a bright, white light, the F502N filter appears to be green, not blue, to the human eye. Similarly, the F656N filter looks a deep red, not green. Only the F673N filter is assigned its perceived color, red. It is also not a natural color scheme because it uses narrowband filters that pass only a very small fraction of the visible spectrum. Furthermore, rather than photometrically calibrating the projected images, they are balanced so that the H-alpha data doesn’t dominate and turn the image completely green. In comparison, a natural color image of M16 (Schoening 1973) shows the nebula as deep red, again because of the strong H-alpha emission from the nebula. The color assignment chosen is an extreme version of hue contrast. The image is also an example of a split complementary color scheme. The blue (from the [OIII] filter) and the green (from the H-alpha filter) combine to generate a background of greenish-cyan, whose complement is a slightly-red orange. Areas of the pillars are yellow-orange and orange-red, which are on either side of this orange on the color wheel, hence the complement is split.
Also the RGB assignment to each layer, along with the intensity stretches, ensured that the combined datasets produced cyan, magenta and yellow, resulting in a strong contrast of hue in the undiluted primaries of the subtractive system. Where the intensity of [OIII] (blue) and H-alpha (green) are balanced, cyan appears in the background. The H-alpha (green) and [SII] (red) images combine to produce yellow along the edge of the pillars. And, while the centers of the stars are white due to the equal combination of RGB, their halos are magenta because the F673N filter broadens the point-spread function. Itten (1990), in his section on this contrast of hue, states “the undiluted primaries and secondaries always have a character of aboriginal cosmic splendor as well as of concrete actuality,” a statement that applies to the HST image of M16. The two supporting contrasts, (split) complementary and light-dark, further boost this image’s appeal.”
Well, basically what I take from that is that you assign red, green and blue in order of the wavelengths of your filter passbands: SII, the longest wavelength, goes to red; H-alpha goes to green; and OIII goes to blue. This is the basis of the so-called Hubble Palette, or “S-H-O,” and this approach has since trickled down to the greater amateur astroimaging community! Who said that astroimaging is not science!
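The chromatic-ordering rule itself boils down to a sort. Here is a minimal sketch of my own, using the WFPC2 filter center wavelengths quoted above from the Rector excerpt:

```python
# Chromatic ordering: sort the narrowband datasets by passband wavelength
# and assign the longest to red, the middle to green, the shortest to blue.
# Center wavelengths (in Angstroms) are the values quoted from Rector et al.
filters = {
    "F673N ([SII])": 6732,
    "F656N (H-alpha)": 6564,
    "F502N ([OIII])": 5012,
}

def chromatic_ordering(filters):
    """Map each filter to an RGB channel in descending wavelength order."""
    ordered = sorted(filters, key=filters.get, reverse=True)
    return dict(zip(ordered, ["red", "green", "blue"]))

print(chromatic_ordering(filters))
# {'F673N ([SII])': 'red', 'F656N (H-alpha)': 'green', 'F502N ([OIII])': 'blue'}
```

The result is exactly the S-H-O assignment: the rule generalizes to any set of three filters, which is why the same recipe works for amateur narrowband data too.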
Next time I will show a recent example from my first ever attempt at processing an “S-H-O” image.
Thanks for reading!