Late last year I posted an article on serendipitously finding an asteroid in my data from CFHT (see the post here). Well…I found another one! Luckily, this new asteroid happened to pass by a beautiful example of an edge-on spiral galaxy.
Can you see it? There are about 20 minutes between the first two exposures, and almost 4 hours before the third.
You may be wondering, though, why the three images above do not look like the beautiful photographs we are used to seeing (these, for instance). Why are my images black and white? Why is that edge-on galaxy so boring-looking? Let me explain.
We take images of the sky using a CCD, which is just an array of light-sensitive pixels. These pixels are sensitive to a rather large range of light, extending from the near-UV, through the optical, into the near-infrared. But the pixels do not care what kind of light they detect. When a photon of any wavelength* hits a pixel, the pixel records it. In order to make the beautiful astronomy pictures we see, it takes a few more steps:
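To make the point concrete, here is a minimal toy sketch (not real detector code) of a CCD pixel that simply counts photons. The `expose` function and the photon lists are invented for illustration, and a real pixel's wavelength-dependent quantum efficiency is ignored:

```python
# A toy CCD pixel: it counts photons but records nothing about their colour.
# (Real pixels have a wavelength-dependent quantum efficiency, ignored here.)

def expose(pixel_count, photons_nm):
    """Add one count per incoming photon, regardless of its wavelength."""
    return pixel_count + len(photons_nm)

# Two exposures with very different light hitting the pixel:
blue_heavy = [390, 410, 450, 470]  # mostly near-UV/blue photons
red_heavy = [650, 700, 780, 820]   # mostly red/near-IR photons

print(expose(0, blue_heavy))  # 4
print(expose(0, red_heavy))   # 4 -- identical readout; colour information is lost
```

Both exposures produce the same count, which is exactly why the raw images come out "black and white": the pixel value tells you how much light arrived, not what colour it was.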
First, you have to filter out the colours you want individually. The above three images are the same patch of sky imaged through three different filters: the g filter (390nm – 595nm), the r filter (595nm – 680nm), and the i filter (680nm – 830nm).** While all the images look black and white, they are in fact exposed using three completely different ranges of colours of light. Again, the pixels do not care what colour the light is.

Second, you have to combine the various filtered images into one, while imposing a colour scheme that matches what the colour should be in the wavelength ranges of the filters you used. For instance, since the above image uses g, r, and i, I would add the images together with X amount of light coloured near 500nm for g, Y amount of light coloured near 650nm for r, and Z amount of light coloured near 750nm for i.
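The combining step above can be sketched with a few lines of NumPy. This is a simplified illustration, not the actual pipeline: the tiny 2x2 arrays stand in for real filtered exposures, the channel mapping (i to red, r to green, g to blue) follows the convention of sending longer wavelengths to redder channels, and the weights play the role of the X, Y, and Z amounts mentioned in the text:

```python
import numpy as np

# Hypothetical stand-ins for three filtered exposures of the same sky patch.
g = np.array([[1.0, 2.0], [3.0, 4.0]])  # g filter (390-595nm)
r = np.array([[2.0, 2.0], [2.0, 2.0]])  # r filter (595-680nm)
i = np.array([[0.5, 1.0], [1.5, 2.0]])  # i filter (680-830nm)

def make_rgb(g_img, r_img, i_img, weights=(1.0, 1.0, 1.0)):
    """Combine three filtered images into one RGB composite.

    Longest wavelength goes to the reddest channel: i -> red,
    r -> green, g -> blue. Each channel is scaled by its weight
    (the X, Y, Z amounts), then the result is normalized to [0, 1].
    """
    wg, wr, wi = weights
    rgb = np.stack([wi * i_img, wr * r_img, wg * g_img], axis=-1)
    return rgb / rgb.max()

rgb = make_rgb(g, r, i)  # shape (2, 2, 3), ready for e.g. plt.imshow(rgb)
```

Choosing the weights is where the artistry comes in: they set how strongly each filter's light shows up in the final colour balance.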
A final thought: For those of you who have looked through a telescope at things like the Orion Nebula, Andromeda Galaxy, the Ring Nebula, or any other cool astronomical object, you may notice that these objects appear black and white to our eye as well, and not like the vibrant images I linked to just now. There is no CCD involved, so why are they black and white? The answer lies in our eye's biology. We have two different, highly specialized types of light-sensing cells in our eyes: rods and cones. Rods detect light vs. dark, contrast, and brightness. They see in black and white. Cones detect different colours; there are three kinds of cones, for red, green, and blue (RGB). Cones are far less sensitive than rods; in fact, rods can sense light up to 100x dimmer than cones can. So when we look through a telescope at a distant nebula, there is, in fact, RGB light travelling through space to us, but it is not intense enough to activate our cones. The rods DO perceive the light, and give us a great contrast image of the astronomical object.
Do not take this to mean that there is no colour in space, and that we photoshop it in. No, the colour is there, we just need to tease it out!
* Wavelength is the peak-to-peak distance of a light wave. Ultraviolet light is 10nm to 400nm, optical light is 400nm to 800nm, and infrared is 800nm to 2500nm. These are rough numbers.
** You can see my research post on how I sorted my CFHT targets into different redshift bins using the filters here. It also has a nice graph showing the wavelength range of the u,g,r,i,z filters.