Observing Run: Raw Data versus Finished Product

So let’s say you have a galaxy:

[Image: Bulgeless AGN 2, SDSS]

And you know this galaxy has a growing black hole, and probably hasn't had any significant mergers, because it has very little, if any, bulge. That leaves you with two questions: 1) what counts as significant? and 2) how little is very little?

To answer the first question, you’d like to look for the faint stellar streams that signify the remnants of a minor merger. The optical images you already have aren’t even close to deep enough to see something like this:

[Image: NGC 5907, the Splinter galaxy. Credit: R. Jay Gabany]

But if you could see that for your galaxy, you could start to put together its minor merger history and answer that first question.

Of course, that kind of depth is not easy. The group that took that data most likely spent weeks observing that one source, and there are many technical challenges involved. You may be in luck, though: you have a bigger telescope, which means you probably only need one night to get a single-filter optical image at the same depth.

So you go to the telescope, and you take some data. After 5 minutes, this is what you have:

[Image: bulgeless_agn_2_reallyraw]

Which … doesn’t look so great, actually, until you clean it up a bit by correcting for the different effects that come with a huge mosaic of CCD chips, like different noise levels and so forth. Luckily, the people who wrote the code to observe with this instrument have provided a “first-look” button that automatically does that pretty well:

[Image: bulgeless_agn_2_raw]

That’s better. You can see that even with 5 minutes of observing time, you’re close to the depth you already had. To get what you need, though, you don’t need 5 minutes of exposure. You need 5 hours.
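That jump from 5 minutes to 5 hours buys you a surprising amount of depth. As a rough sketch, for sky-limited imaging the signal-to-noise ratio grows as the square root of the exposure time, so the gain in limiting magnitude can be estimated like this (a back-of-the-envelope calculation, not a real exposure-time calculator, which would also account for sky brightness, seeing, and detector noise):

```python
import math

def depth_gain_mag(t_new_s, t_ref_s):
    """Approximate gain in limiting magnitude from a longer exposure,
    assuming sky-limited imaging where S/N scales as sqrt(t).
    A rough sketch only: real exposure-time calculators include sky
    brightness, seeing, and detector noise."""
    return 2.5 * math.log10(math.sqrt(t_new_s / t_ref_s))

# Going from 5 minutes to 5 hours (a factor of 60 in time)
# gains roughly 2.2 magnitudes of depth.
gain = depth_gain_mag(5 * 3600, 5 * 60)
```

Two extra magnitudes may not sound like much, but it's a factor of several in surface-brightness sensitivity, which is exactly what faint stellar streams demand.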

But you don’t want to just set the telescope to observe for 5 hours and hit “go”. In fact, you can’t do that. If you do, those well-behaved little stars near your galaxy will be so bright on the detector that they’ll “saturate”, filling their pixels with electrons that then spill out into nearby pixels. This detector in particular doesn’t handle that very well, so you need to avoid that. And what if something happens in those 5 hours? What if a cosmic ray — or many — hits your detector? What if a satellite passes over? What if the telescope unwraps? While that looks kind of cool:

[Animation: unwrapping the telescope]

When the telescope rotates to +180 degrees, it stops tracking and rotates back a full 360 degrees so that it can resume tracking from -180 degrees. Otherwise, cables plugged into walls + a telescope twirling round and round = unhappy telescope.
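The cable-wrap protection described above amounts to a simple rule, sketched here as a hypothetical function (real telescope control systems are far more involved, with soft limits, ramp-down, and re-acquisition):

```python
def rewind_if_needed(rotator_angle_deg):
    """Sketch of cable-wrap protection: if the rotator reaches its
    +180 degree limit, rewind a full turn so tracking can resume
    from -180 degrees. Hypothetical logic for illustration only."""
    if rotator_angle_deg >= 180.0:
        # Stop tracking and unwind one full revolution.
        return rotator_angle_deg - 360.0
    return rotator_angle_deg
```

Any exposure in progress during that full-turn rewind is, of course, ruined.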

It wrecks the whole exposure. Plus, those chip gaps are right where your stellar streams might be. You’d like to get rid of them.

So you solve all of these problems at once by taking multiple exposures and moving the telescope just a little in between them.

[Animation: Bulgeless AGN 2, dithered]

This is called dithering.
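For a sense of what a dither pattern is, here is a minimal sketch that generates small random pointing offsets, one per exposure. This is purely illustrative: real dither patterns are usually designed deliberately to step over chip gaps and sample the detector evenly, not drawn at random, and the parameter names here are made up.

```python
import random

def dither_offsets(n_exposures, max_offset_arcsec=30.0, seed=0):
    """Generate a small (RA, Dec) offset in arcseconds for each exposure.
    A minimal sketch: real dither patterns are typically designed to
    cover chip gaps rather than drawn randomly. Parameters are
    hypothetical illustrations, not any instrument's actual defaults."""
    rng = random.Random(seed)
    return [
        (rng.uniform(-max_offset_arcsec, max_offset_arcsec),
         rng.uniform(-max_offset_arcsec, max_offset_arcsec))
        for _ in range(n_exposures)
    ]
```

Because each exposure lands on slightly different pixels, stars saturate in different places, chip gaps fall on different parts of the sky, and any one bad exposure can be discarded without a hole in the final image.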

You end up getting your 5 hours' exposure time by doing lots of dithers, about 50 of them, mostly between 5 and 10 minutes apiece. This has several advantages and a few disadvantages. You can throw out any weird exposures (like the unwrap above) without losing very much time, but then you have to combine 50 images together. And that, frankly, is kind of a pain.
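The simplest version of that combination step is a median stack, sketched below under the (big) assumption that the exposures have already been aligned onto a common pixel grid. The median neatly rejects anything that appears in only one or two exposures, like cosmic rays and satellite trails; real pipelines go further, with reprojection, noise weighting, and sigma clipping.

```python
import numpy as np

def combine_exposures(aligned_images):
    """Median-combine a list of already-aligned exposures.
    A minimal sketch: the per-pixel median rejects outliers (cosmic
    rays, satellite trails) that hit only one or two exposures.
    Real pipelines also reproject each dither onto a common grid
    and weight by noise before combining."""
    stack = np.stack(aligned_images, axis=0)
    return np.median(stack, axis=0)

# Three toy exposures; one has a "cosmic ray" hit in the corner.
imgs = [np.ones((2, 2)) for _ in range(3)]
imgs[1][0, 0] = 1000.0
combined = combine_exposures(imgs)  # cosmic ray vanishes in the median
```

This is also why dithering and stacking go hand in hand: with 50 exposures, no single cosmic ray, satellite, or chip gap survives into the final image.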

And this is a new instrument, and the reduction pipeline (the software that turns raw frames into the beautiful finished product) doesn't fully exist yet, and what does exist is complex and, for the moment, completely unknown to you.

So the beautiful finished product will have to wait.

In the meantime, you have a few more galaxies to look at, and that second question to try and answer, on future nights and in a future blog post.
