Saturday 30 April 2016

Galaxy Quest

At last, an image of M101. This is a quick and dirty stack of six ten-minute sub-frames, aligned using Photoshop. There is a lot of electronic noise in the frames which I would like to remove.
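The stacking step itself is easy enough to automate outside Photoshop. Here's a minimal sketch in Python with NumPy — synthetic frames stand in for my real data — showing why a median combine, rather than a simple average, helps with exactly this kind of noise:

```python
import numpy as np

def stack_frames(frames):
    """Median-combine a list of aligned sub-frames.

    A median stack rejects outliers (hot pixels, cosmic-ray hits)
    that a simple mean would smear into the final image.
    """
    cube = np.stack(frames, axis=0)  # shape: (n_frames, height, width)
    return np.median(cube, axis=0)

# Synthetic demonstration: six "sub-frames" of the same flat scene,
# one of which carries a saturated hot pixel.
rng = np.random.default_rng(0)
frames = [np.full((4, 4), 100.0) + rng.normal(0, 1, (4, 4)) for _ in range(6)]
frames[2][1, 1] = 60000.0  # the defect

stacked = stack_frames(frames)
print(stacked[1, 1])  # close to the background level of 100, not 60000
```

With only six frames the rejection isn't perfect, but the hot pixel no longer dominates the stacked result.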



A pass through http://nova.astrometry.net/ confirms the pointing:


And here's a crop, with some contrast tweaking, of the galaxy itself:


The string of stars in the lower left corner (and elsewhere) is the result of individual hot pixels being shifted and duplicated as I did the manual alignment. This is the kind of electronic noise which I'd like to remove when I get a bit more time.
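One standard fix, which I may try when I get that time, is to clean each sub-frame before aligning it: any pixel that towers over its 3x3 neighbourhood gets replaced by the local median, so it can't be duplicated into a string of dots when the frame is shifted. A rough sketch (the threshold here is illustrative, not something I've tuned against my camera):

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_hot_pixels(frame, threshold=50.0):
    """Replace isolated hot pixels with the local 3x3 median.

    A pixel exceeding the median of its neighbourhood by more than
    `threshold` counts is treated as a sensor defect and overwritten.
    """
    local_median = median_filter(frame, size=3)
    hot = (frame - local_median) > threshold
    cleaned = frame.copy()
    cleaned[hot] = local_median[hot]
    return cleaned

frame = np.full((5, 5), 100.0)
frame[2, 2] = 5000.0  # a single hot pixel
cleaned = remove_hot_pixels(frame)
print(cleaned[2, 2])  # back to the background level of 100
```

Real stars survive this because they are spread over several pixels by the optics, whereas a hot pixel is a one-pixel spike.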

Thursday 28 April 2016

Slow Bullets on the Hugo ballot

A couple of weeks ago I was contacted by the Hugo administrators to let me know that my novella "Slow Bullets" would be one of the finalists in that category. I was pleased, but not without some obvious misgivings. I'd been unhappy about the inclusion of my story on the recommendation lists of the Sad and Rabid Puppies, especially given that the latter was to all intents and purposes just another slate, designed to encourage block voting. At the time no one really had a clear idea about how dominant the Puppy factor was going to be in this year's shortlists.

Trying to have my cake and eat it, I suggested to the administrators that I'd gladly accept the inclusion now, but that I might change my mind when I saw the extent to which Puppy choices had (or not) dominated the ballot. The best case I was realistically hoping for would be one or two obvious Puppy candidates showing up, but an otherwise fair selection which didn't show blatant signs of block voting. I'd had high hopes for Slow Bullets, after all. I considered it a strong story, and it had picked up enough positive reviews and recommendations throughout the year that it didn't seem beyond the bounds of possibility that it might make the ballot. That's not to say I was confident, just that the omens were about as good for that story as they had been for any of my recent pieces.

The administrators, quite reasonably, wanted a clearer, less ambiguous commitment from me. After a friendly and productive transatlantic phone call, I came around to the view that I'd not only accept the nomination, but take whatever came after it.

As several commentators have noted, the eventual ballots are quite strongly biased in favour of Rabid Puppy choices. The unpalatable conclusion to be drawn from this is that my story, good as its chances were, probably wouldn't have made the cut were it not for the RP block vote. However, I didn't ask for those votes and in fact I expressly requested that my story not be slated. Kate Paulk (of the Sads) and Vox Day (of the Rabids) both declined my requests.

Since the announcement of the ballots, there's been quite a lot of discussion about the rights and wrongs of the finalists withdrawing their stories. Quite honestly, I'm very sympathetic to both sides of the debate. If I knew then what I know now, I'd probably have declined the initial nomination. But I didn't, and beyond that I made a commitment to the administrators not to withdraw at a later stage. On that basis alone, therefore, I'm keeping "Slow Bullets" on the ballot. I can't say I'm exactly overjoyed about this decision, though - from my point of view it just feels like the least worst choice of a very bad hand. Compare and contrast with the situation when my only other nomination happened, for "Troika": my mood then couldn't have been more different.

Let's hope things are better next year.

Wednesday 20 April 2016

Pattern Recognition

Technology marches on. Sometimes I find myself caught entirely unawares by some capability which not only works flawlessly, but which has become almost freely available to the consumer or enthusiastic amateur. I still remember buying a digital SLR camera which had face recognition built in almost as an afterthought. I'd given it no great thought until - with the camera sitting powered-up on a coffee table - I noticed that it was locking in on the picture of the Queen's head on a five-pound note! It was a shock to realise that this supposedly futuristic image-processing functionality was not only highly robust, but so cheap to implement that it was barely mentioned in the camera's sales material.

I had a similar experience earlier today. After a long run of cloudy nights, I've finally been able to get outdoors with the telescope and attempt to continue my long-running adventure in astronomical imaging.

This being Spring, one of the obvious candidates is M13, the globular cluster in Hercules. It's easily visible in telescopes and not hard to find in binoculars. I've looked at it many times over the years, but only this week did I manage to get a picture of it.



I shot this using a telescope on a GoTo mount. The mount contains a computerised database, and provided it's set up reasonably well at the start of a night, the "kit" enables the telescope to automatically locate and track astronomical objects. So, although I can find M13 for myself, I didn't need to: I just typed "M13" into the handset and off it whirred. The pointing accuracy was such that the cluster ended up close to the centre of the field of view and there was no difficulty obtaining images.

M13's fairly easy game, though, and I fancied a stiffer challenge. High in the Spring sky, near the "handle" of the Plough, is M101 - a "grand design" spiral galaxy of quite exceptional beauty. It's also very faint and a reportedly tricky object to see by eye alone, even through a telescope. With light pollution and a bright moon, there was no chance of that - but I was still optimistic that I could obtain an image. After all, at least I didn't have to worry about the pointing part - the telescope and its GoTo mount could take care of that for me.

However, I didn't succeed. Here's one of several frames I took over two clear nights:


Fascinating stuff, eh. Some random stars and electrical noise, and no sign of anything resembling a spiral galaxy. Now, I mentioned that the GoTo mount does need to be set up properly at the start of the night - accurate polar alignment and all that - so there's always a possibility that it isn't quite aiming in the right direction by the time it thinks it's found M101. Compounding that, the digital camera that I attach to the telescope samples a smaller field of view, so the object of interest might be falling just outside the camera's reach.

It was then that an astronomer on Twitter, Andrew Gray, mentioned that there is such a thing as "plate solving". The idea is that a sufficiently powerful pattern matching algorithm can analyse an image of some stars and work out exactly which part of the sky they correspond to. That's an incredible feat of computation, especially if you don't know in advance little niceties like angular scale, brightness calibration and so on. Now, I was distantly aware that something like this capability existed, but I had no idea it was now easily within reach of the amateur such as myself.
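The core trick is easier to appreciate with a toy example: ratios of distances between stars are unchanged by shifting, rotating or zooming the frame, so they can be matched against a catalogue without knowing the scale in advance. (As I understand it, the real astrometry.net engine indexes four-star "quads" rather than triangles, but the invariant idea is the same.)

```python
import itertools
import math

def triangle_signature(p1, p2, p3):
    """Scale- and rotation-invariant signature of a star triangle:
    the two shorter side lengths divided by the longest."""
    sides = sorted(math.dist(a, b) for a, b in
                   itertools.combinations((p1, p2, p3), 2))
    return (sides[0] / sides[2], sides[1] / sides[2])

# "Catalogue" positions of three stars (arbitrary units)...
catalog = [(0.0, 0.0), (4.0, 0.0), (1.0, 3.0)]
# ...and the same stars on a camera frame: shifted and scaled by 2x.
image = [(10.0, 10.0), (18.0, 10.0), (12.0, 16.0)]

sig_cat = triangle_signature(*catalog)
sig_img = triangle_signature(*image)
print(sig_cat, sig_img)  # identical up to floating point
```

Index millions of such signatures over the whole sky, look up the ones from your frame, and the few consistent matches pin down exactly where the camera was pointing.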

However, within minutes I had uploaded my frame to:

http://nova.astrometry.net/

And almost as quickly I had a set of annotations for my frame:


This is ASTONISHING. Not simply because it confirms that M101 was indeed within my field of view - but presumably too washed out by glare to show up - but that this capability exists and is free for use, and returns results quickly enough to be useful for astronomers actually sitting at their telescopes, trying to find stuff. I didn't even have to submit the image in some fancy, astronomy-only image format, either - I just sent a JPEG.

For the sheer hell of it, I also ran my M13 image through the processor:


I'm amazed, and impressed, and excited by the possibilities. Truly we are blessed to be living at a time when such miraculous feats of technology are easily within our grasp.

Just to end, here's a close-up of M13:


The light I caught had travelled 25,000 years to reach my telescope. If there's ever a day when that sort of thing doesn't send a shiver down my spine, please feel free to shoot me.