Stripe 82 and colour images from Sloan
As Chris blogged yesterday, Galaxy Zoo now contains colour images from the Sloan Digital Sky Survey’s Stripe 82.
The new Stripe 82 images you see are each made by adding together approximately 50 ordinary SDSS images, which means we can see things about 2 magnitudes fainter – stacking 50 exposures improves the depth by roughly √50 ≈ 7, so we can detect objects giving off about 7 times less light. It only covers a relatively small patch of the sky – 270 square degrees, compared to nearly 8,000 square degrees for the full survey – but the extra depth should be useful to us in many ways.
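The arithmetic behind that depth gain is simple enough to sketch: stacking N exposures improves the signal-to-noise by roughly √N, and converting that factor into magnitudes uses the standard 2.5·log₁₀ relation. A quick back-of-the-envelope calculation:

```python
import math

def depth_gain_mag(n_images):
    """Approximate gain in limiting magnitude from coadding
    n_images independent exposures: signal-to-noise improves by
    about sqrt(n_images), and 2.5 * log10 converts a flux ratio
    into a magnitude difference."""
    return 2.5 * math.log10(math.sqrt(n_images))

print(round(depth_gain_mag(50), 1))  # about 2.1 magnitudes deeper
```

This is why ~50 coadded images and "about 2 magnitudes fainter" are the same statement in different units.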
Many users will already have noticed that the standard images supplied through the Navigator interface are in black and white – they’re just in the r band (SDSS has 5 bands named u, g, r, i and z of which g, r and i are normally mapped to blue, green and red respectively). Naturally, we wanted to supply the Zoo with colour images like those in the ordinary Sloan survey.
This proved to be somewhat tricky, as the data needed to compile the colour images comes in fairly large chunks called ‘fields’ – each field is 2048 by 1489 pixels, large enough to more than fill a typical computer monitor. So for each galaxy we had to download three of these fields (one for each of the bands mapped to red, green and blue), combine them together, and extract just the bit around the galaxy we wanted to show you, scaled to the size of a normal Galaxy Zoo image. This took a fair bit of programming and many days’ worth of computer time for downloading and processing the data.
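The combine-and-crop step can be sketched in a few lines of numpy. This is only an illustration, not the actual Galaxy Zoo pipeline code: the cutout size, the coordinate handling and the dummy field arrays are all made up, and real SDSS fields would first need to be downloaded, calibrated and aligned.

```python
import numpy as np

FIELD_H, FIELD_W = 1489, 2048   # SDSS field size in pixels
CUTOUT = 424                     # hypothetical output cutout size

def make_cutout(field_i, field_r, field_g, x, y, size=CUTOUT):
    """Stack three single-band field images into one RGB array and
    crop a square region centred on pixel (x, y).

    field_i, field_r, field_g: 2-D arrays for the i, r and g bands,
    mapped to the red, green and blue channels respectively.
    """
    rgb = np.dstack([field_i, field_r, field_g])
    half = size // 2
    # Clamp the corner so the cutout stays inside the field
    y0 = max(0, min(y - half, rgb.shape[0] - size))
    x0 = max(0, min(x - half, rgb.shape[1] - size))
    return rgb[y0:y0 + size, x0:x0 + size]

# Usage with random data standing in for three downloaded fields
fields = [np.random.rand(FIELD_H, FIELD_W) for _ in range(3)]
cut = make_cutout(*fields, x=1024, y=700)
print(cut.shape)  # (424, 424, 3)
```

The clamping matters in practice: galaxies near a field edge would otherwise produce undersized cutouts (the real pipeline also has to stitch neighbouring fields together in that case).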
[Image: two views of the same area – the upper from a standard SDSS image, the lower from the coadded stack of about 50 images.]
Another complication was that astronomical images usually come in a format that’s not immediately suitable for viewing. They can have a tremendous dynamic range, from the tiny amount of light that comes from the dark bits of sky to the dramatic overloading the camera gets when it images a bright star. The task of reducing this down to fit in the brightness scale of 256 levels of each of red, green and blue that a computer monitor will display is not easy.
Fortunately, this problem had already been solved. For those interested in the gory details, there is a paper here by Lupton et al. which describes the system used both in these images and in the ordinary SDSS images you’ve been seeing before. It’s a bit mathematical, but in essence it does three important things:
- For faint objects, the brightness of the pixel is roughly in proportion to the amount of light received. This shows up faint details nicely but…
- For bright objects, we’d hit the cap of 256 levels on the monitor too quickly, so it starts to scale things logarithmically instead. This means that doubling the light wouldn’t double the value of the pixel but just add a certain amount on. If 10 units of light were collected we might have a pixel value of 1, 100 units would be 2, and 1000 units would be 3, and in this way we can fit the brightest objects nicely into the range our monitors provide. This is also how astronomical magnitudes work.
- Lastly, we want to get the colour of objects right. A really bright object might be very red, yet still give off so much green and blue light that the pixel ends up with high values of red, green and blue – and so looks white. The code we use compensates for that and makes sure everything has its actual colour represented properly.
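The three points above are what the asinh stretch of Lupton et al. achieves: the arcsinh function is nearly linear for small values and logarithmic for large ones, and applying the same scale factor to all three bands preserves colour ratios. Here is a minimal sketch of the idea – the parameter names `stretch` and `Q` and their default values are illustrative choices, not the exact SDSS settings:

```python
import numpy as np

def asinh_stretch(r, g, b, stretch=0.1, Q=8.0):
    """Sketch of an asinh colour stretch in the style of Lupton et al.

    The scale factor is computed once from the total intensity and
    applied identically to all three bands, so a red galaxy stays red
    even when it is bright.  arcsinh(x) ~ x for small x (linear regime
    for faint pixels) and ~ log(2x) for large x (logarithmic regime
    for bright pixels)."""
    i = (r + g + b) / 3.0                      # mean intensity per pixel
    safe_i = np.where(i > 0, i, 1.0)           # avoid dividing by zero
    factor = np.arcsinh(Q * i / stretch) / (Q * safe_i)
    factor = np.where(i > 0, factor, 0.0)      # empty sky stays black
    out = np.dstack([r, g, b]) * factor[..., None]
    return np.clip(out, 0.0, 1.0)              # into the 0..1 display range

# Usage: a flat grey patch stays grey after stretching
img = asinh_stretch(np.full((2, 2), 0.3),
                    np.full((2, 2), 0.3),
                    np.full((2, 2), 0.3))
print(img.shape)  # (2, 2, 3)
```

(A maintained implementation of this technique exists as `make_lupton_rgb` in astropy, for anyone who wants to experiment with real data.)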
To use this, we need to decide how steep the slope of our conversion from light to pixel values should be, and at what point we tip over from the linear function for faint light to the logarithmic one for strong sources. This takes a bit of fiddling and, to be honest, is as much an art as a science. We also have to use different values from those used for the ordinary SDSS images, since our images contain a different amount of light overall. This is why our background sky ends up looking more speckled than usual (there’s more background noise, and making it more visible is the price we pay for making faint features of galaxies visible too) and why the galaxies themselves look like they’re stretched differently.
One of the developers of this technique, Robert Lupton, has a webpage which shows part of the Hubble Deep Field coloured the conventional way and using this technique, and you can see how the colours of the galaxies are better preserved this way.
I hope that gives a bit more insight into these images, how they were made and why they look a little different from the usual. We look forward to finding out how the classifications go!