This could be a new lease of life for ground-based telescopes. The new technique, called "Lucky Imaging", works on top of adaptive optics. Using a far more sensitive light-sensing chip, Cambridge and Caltech have developed a cure for the problems which plague the big observatories.
The BBC has been kind enough to supply some comparative images. Interesting stuff, this. Palomar is a famous observatory, with a 200-inch mirror, but until now using it has been like sitting through the optometrist's better/worse test routine. Lucky Imaging uses software to clear up atmospheric distortion: the camera takes a rapid stream of short exposures, and the system identifies and kills off the distorted frames, keeping only the sharp ones. This means an object 25,000 light years away is now a legitimate photo op.
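The frame-selection idea is simple enough to sketch. This is my own minimal illustration, not the actual Cambridge/Caltech pipeline: score each short exposure for sharpness, keep only the best fraction, and average those. The variance-as-sharpness proxy and the synthetic frames are assumptions for the demo.

```python
import numpy as np

def lucky_stack(frames, keep_fraction=0.1):
    """Keep only the sharpest frames and average them.

    Sharpness proxy: pixel variance. A frame smeared by turbulence
    spreads its light out, so its variance drops; a lucky, sharp
    frame keeps a bright concentrated peak and scores high.
    """
    scores = np.array([frame.var() for frame in frames])
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-n_keep:]  # indices of the sharpest frames
    return np.mean([frames[i] for i in best], axis=0)

# Demo with synthetic data: 10 "sharp" frames with a point source,
# 90 "smeared" frames where the same light is spread over the chip.
rng = np.random.default_rng(42)
sharp = np.zeros((32, 32))
sharp[16, 16] = 100.0
smeared = np.full((32, 32), 100.0 / 1024)
frames = ([sharp + rng.normal(0, 0.1, (32, 32)) for _ in range(10)]
          + [smeared + rng.normal(0, 0.1, (32, 32)) for _ in range(90)])
stacked = lucky_stack(frames, keep_fraction=0.1)
print(stacked[16, 16])  # the point source survives the stack
```

The key design point is that selection happens before stacking: a plain average of all 100 frames would wash the point source out, while averaging only the sharp ones keeps it.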
Culturally, the Hubble Telescope, the Eighth Wonder of the World, has been the main source of astronomy's pinups. That's put quite a load on Hubble, having to compensate for the inadequacies of the ground-based observatories as well as work to its own strengths. With the new software, they could work very well in tandem: Hubble as surveyor, and the ground-based equipment for the finer work, particularly the closer stuff.
Resolution is also greatly improved by the new system. The telescopes can now separate stars as little as one light day apart, which would have been impossible using conventional systems. As one of the astronomers points out, the ground-based telescopes actually have better optical capacity than Hubble, but the atmospheric soup mix has been in the way.
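That one-light-day figure is easy to sanity-check with small-angle arithmetic. This is my own back-of-envelope calculation, not from the BBC piece: at 25,000 light years, a separation of one light day subtends an angle of separation over distance.

```python
import math

# Small-angle approximation: theta = separation / distance.
# Work in light-days so the units cancel; 1 light year = 365.25 light days.
separation_ld = 1.0
distance_ld = 25_000 * 365.25
theta_rad = separation_ld / distance_ld
theta_arcsec = math.degrees(theta_rad) * 3600
print(round(theta_arcsec, 3))  # about 0.023 arcseconds
```

A few hundredths of an arcsecond is well below what atmospheric turbulence normally allows from the ground, which is the whole point of the new system.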
This sort of detail matters in astronomy. Measures and ratios need accuracy, and visual information can be infuriatingly vague. I’ve been doing some galaxy classification at Oxford’s site, and really, anything that improves optical quality is vitally necessary. Some of those images are grainy like an oak table, some have extra light and general digital noise which is nothing less than maddening. I’ve seen some really dramatic things on that site, and things which look like tomato soup with serious social inhibitions.
This must have taken an awful lot of work. The sorts of distortions they've been dealing with would require careful calibration, and there are a lot of them. The software would have to deal with whatever the inputs might be. Sources of annoyance range from basic atmospheric pollution to light noise from cities, in any magnitude. Passing planes, satellites, inversion layers: the list reads like a dictionary.
This could also be an invaluable technique for photography as a whole. "Light" isn't a constant value for anyone or anything trying to take a picture. Software which can figure out which is the better shot for reproduction, or which has better digital values, is genuinely useful.
One of the reasons for Pixel Rage in graphic art is that at a certain point the software gives up and just assigns an arbitrary value. Even the thousands of colors available have to work with a grid, and that's where the problems start. Thousands of colors sounds impressive, until you remember there are millions of pixels in high-quality digital imaging. Compared to natural light it's getting better, but the human eye is a bit harder to fool than that.
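That "arbitrary value" is really just rounding to the nearest point on the grid. Here's a toy sketch of my own (the 32-level grid is a made-up example, chosen to make the error visible) showing the worst-case rounding error per channel:

```python
import numpy as np

def quantize(channel, levels):
    """Snap continuous 0-255 intensities to a fixed grid of levels."""
    step = 255.0 / (levels - 1)
    return np.round(channel / step) * step

# A smooth gradient forced onto a 32-level grid: every pixel lands
# on the nearest grid point, and the in-between detail is simply gone.
gradient = np.linspace(0.0, 255.0, 1000)
coarse = quantize(gradient, levels=32)
max_err = np.abs(gradient - coarse).max()  # worst case is about step / 2
print(round(max_err, 2))
```

The eye is very good at spotting the banding this produces in smooth gradients, which is exactly where the "software gives up" complaint comes from.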
This system should be able to do light matching at a much better level of accuracy. Astronomy may well have solved a lot of other people’s problems as well.