No, your smartphone doesn’t take better pictures than the Hubble

[Image: screenshot of the Guardian headline about Hubble]

Guardian headline to the contrary, the Hubble Space Telescope is not inferior to a smartphone camera. If you want to skip my lengthy reasons below, here is the short version: your smartphone takes pictures of brightly lit scenes, in a friendly environment, that are meant for showing your friends on Facebook. That’s not what Hubble does, and if your smartphone camera tried to do what Hubble does, it would fail horribly.

To be fair, the article is accurate about an important point: the gear that goes into space is seldom cutting edge. It is hard to service, repair or upgrade once it is up there, and it was usually designed years before launch, sometimes decades ago.

But the headline is going to be the only thing most people read, and it has important negative consequences. It makes people think that the scientists at NASA and other space agencies are too dumb to use smartphone cameras. Too bureaucratic. Too slow. Too tied up spending billions of “hard-earned taxpayer dollars” on more-expensive-but-less-good gadgets. All for some low quality pictures?

Let’s take a look at the ‘real picture’, shall we?

It’s all about the photons. Cameras work by gathering photons, focusing them, and capturing them on an imaging sensor. All else being equal, not enough photons = grainy picture. As you know, a smartphone camera will take a brilliantly crisp photo outdoors on a sunny day. Indoors is OK if all the lights are on, but usually low quality if it’s even a little bit dim.
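To see why the photon count matters so much, here is a toy Python sketch (not any real camera pipeline) that simulates the random “shot noise” in the number of photons a single pixel collects. The signal-to-noise ratio grows roughly as the square root of the photon count, which is why dim scenes come out grainy:

```python
# Toy illustration of photon shot noise; photon counts below are made-up examples.
import numpy as np

rng = np.random.default_rng(0)

def snr_for_photon_count(mean_photons, trials=100_000):
    """Simulate many exposures of one pixel and report its signal-to-noise ratio."""
    samples = rng.poisson(mean_photons, size=trials)
    return samples.mean() / samples.std()

print("Sunny day, ~10,000 photons per pixel:", round(snr_for_photon_count(10_000), 1))
print("Dim room,     ~100 photons per pixel:", round(snr_for_photon_count(100), 1))
print("Faint object,   ~4 photons per pixel:", round(snr_for_photon_count(4), 1))
```

Real sensors pile read noise and other effects on top of this, which only makes the faint-light problem worse.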

The New Horizons probe is taking pictures of Pluto as I write. Pluto has a weird orbit, and right now it is closer to the Sun than it is most of the time: about 33 times as far from the Sun as Earth is. Because light falls off with the square of distance, the Sun casts about 1,100x LESS light out there than it does at Earth’s distance. New Horizons needs to capture the reflected photons to take pictures, and that’s much harder than it is in your living room. And the Hubble Space Telescope is normally used to take pictures of astronomical objects that are millions or even billions of times fainter than the surface of Pluto. The reason so many space pictures look bad is that the objects are really far away and there’s not much light. Imaging a 30th magnitude galaxy is a tough problem: you’re looking at objects that are many orders of magnitude fainter than the random noise in a smartphone camera!
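A quick back-of-the-envelope check of those numbers, as a Python sketch. Sunlight falls off with the square of distance, and the astronomers’ magnitude scale means every 5 magnitudes is a factor of 100 in brightness:

```python
# Back-of-the-envelope check of the figures in the text (illustrative only).

# Sunlight falls off with the square of distance from the Sun.
pluto_distance_au = 33                       # Pluto's current distance, in Earth-Sun units
dimming = pluto_distance_au ** 2
print(f"Sunlight at Pluto is ~{dimming:.0f}x fainter than at Earth")      # ~1089, i.e. ~1100x

# Magnitudes: each 5 magnitudes is a factor of 100 in brightness.
magnitude_difference = 30                    # a 30th-magnitude galaxy vs. a 0th-magnitude star
flux_ratio = 10 ** (magnitude_difference / 2.5)
print(f"A 30th-magnitude object is ~{flux_ratio:.0e}x fainter than a 0th-magnitude star")  # ~1e12
```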

Space is a rough neighbourhood. Smartphones only work between -10C and +40C, or roughly 263K to 313K (Kelvin is like Celsius, but with 0K being absolute zero, or -273C). Depending on the space mission, likely temperatures range from about 3K to 450K, which is a range about 9x larger than a smartphone camera could handle. G-forces on launch are much worse than dropping your phone onto the road. But the real issue is radiation: both cosmic rays and charged particles are absolute murder on imaging sensors. Sensors are designed to sense things, so you can’t make them out of lead or seal them up in a perfectly shielded box. Some people are looking at using commercial off-the-shelf (COTS) cameras that are not radiation-hardened, but at this time almost everything in space is specially tailored for years of high radiation exposure. Your smartphone camera might last a week or even a month, but it wouldn’t last years. By the way, one of the world leaders in making space-ready cameras is Canada’s Dalsa. Now owned by US-headquartered Teledyne, they still do great work: they didn’t make Hubble’s main camera (the Wide Field Camera 3), but they are on the Mars Surveyor, for instance.
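For the curious, here is the arithmetic behind that “about 9x” figure, using only the numbers quoted above:

```python
# Operating-range comparison using the figures from the text.
def c_to_k(celsius):
    return celsius + 273.15

phone_range_k = c_to_k(40) - c_to_k(-10)     # smartphone spec from the text: -10C to +40C
space_range_k = 450 - 3                      # rough mission range from the text: 3K to 450K

print(f"Smartphone operating range: {phone_range_k:.0f} K wide")
print(f"Space mission temperature range: {space_range_k:.0f} K wide")
print(f"The space range is ~{space_range_k / phone_range_k:.0f}x wider")  # ~9x
```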

This will get a little technical. There are so many ways in which smartphone cameras are NOT the same as space cameras that it’s hard to know where to start. Speed, resolution, and packaging are the minor issues. The sensor itself is MUCH larger on Hubble’s WFC3 – you couldn’t just stick a tiny smartphone sensor in the optical path.
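As a rough illustration of why sensor and pixel size matter, here is a sketch using assumed ballpark pixel sizes — roughly 1.4 microns for a smartphone and 15 microns for a scientific CCD — not the actual specifications of any particular Hubble instrument or phone:

```python
# Illustrative only: both pixel sizes below are assumed ballpark figures.
phone_pixel_um = 1.4        # assumed: a common smartphone pixel pitch
ccd_pixel_um = 15.0         # assumed: a typical scientific CCD pixel pitch

area_ratio = (ccd_pixel_um / phone_pixel_um) ** 2
print(f"Each large CCD pixel collects roughly {area_ratio:.0f}x more light "
      "than a tiny smartphone pixel, all else being equal")
```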

One MAJOR issue is that all smartphone cameras today use a kind of sensor called CMOS, while the main Hubble camera uses something called a CCD. Here is an excellent overview of some of the key differences between the two technologies. Both have their virtues: CMOS is low power, which makes it ideal for battery-powered smartphones. CCDs use more power, but that’s not a problem when you have giant solar panels providing kilowatts whenever you need them. As one minor(ish) issue, CMOS sensors typically use something called a rolling shutter, while CCDs read out the whole frame at once (a global shutter), and for most science/space imaging tasks a global shutter is much better. That’s not even the worst problem with CMOS cameras: they don’t produce images the same way CCDs do. They need to do some level of on-board processing to make the pictures look good, which produces good selfies but reduces the scientific usefulness of the images. Astronomers want the “raw data”, and CMOS sensors don’t give you that as their output. Also, CMOS cameras tend to have lower dynamic range: they are less good at capturing really bright areas and really dark areas in the same image (and you get a lot of that in space!), and that is bad for scientific imaging.
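To make the dynamic range point concrete, here is an illustrative sketch. The full-well and read-noise figures are assumed ballpark values for “a typical scientific CCD” versus “a typical small CMOS sensor”, not measurements of Hubble or of any specific phone:

```python
# Illustrative dynamic-range comparison; all sensor numbers below are assumptions.
import math

def dynamic_range_stops(full_well_electrons, read_noise_electrons):
    """Usable brightness range, in photographic stops (powers of two)."""
    return math.log2(full_well_electrons / read_noise_electrons)

print(f"Scientific CCD: ~{dynamic_range_stops(80_000, 3):.1f} stops")   # assumed 80k e- well, 3 e- noise
print(f"Small CMOS:     ~{dynamic_range_stops(5_000, 3):.1f} stops")    # assumed 5k e- well, 3 e- noise
```

More stops means a single exposure can hold both a bright star and the faint wisps around it without clipping either one.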

But the other issue is what we mean by “taking pictures.” A smartphone camera does a good job capturing light in the visible spectrum, which has wavelengths of roughly 400-700 nanometers, as seen in the image below. But the Hubble ‘cheats’: it is specially designed to capture the full visible spectrum, but it can also image longer-wavelength infrared light and shorter-wavelength ultraviolet light (the latter is particularly useful in space). Remember the photons? Being able to ‘see’ photons from a broader spectrum gives you more photons to work with, as well as providing other scientifically useful information. Smartphones don’t do that. As an example, the image at the top of the Guardian article is of the Horsehead Nebula. Pretty, isn’t it? But that image is not from the visible light a smartphone camera can capture; it is infrared. In visible light, the nebula is just an opaque black mass!

[Image: the visible light spectrum, roughly 400-700 nanometers]
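For reference, here is that rough band breakdown in code form (a toy sketch; the real boundaries between ultraviolet, visible, and infrared are fuzzier than these cutoffs):

```python
# Rough wavelength bands, using the ~400-700 nm visible range from the text.
def classify_wavelength(nanometers):
    if nanometers < 400:
        return "ultraviolet"
    elif nanometers <= 700:
        return "visible"
    else:
        return "infrared"

for wavelength in (150, 550, 1600):
    print(f"{wavelength} nm -> {classify_wavelength(wavelength)}")
```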

Processing is not done onboard. One of the odd things about the Guardian article is how the writer keeps talking about processor statistics: “The third servicing mission was in 1999 and that was when the processor was last upgraded, from 1.25MHz to 25MHz, still way below the specifications we are familiar with today.” True enough, but it reflects a serious misunderstanding: the Hubble is doing something very different from a smartphone. We love our smartphones, and we love what they do to our pictures: they compress them, encode them, add filters, and modify them for transmission over cellular or Wi-Fi. These are all processor-intensive tasks, and the Hubble just couldn’t keep up.

But the Hubble isn’t about Instagram filters. It is a scientific mission, and its job is to take the best pictures possible and then send them down to Earth as accurately as possible. There may be some error-checking on board, but no filtering, compressing, or any of the other things we expect from smartphones. Therefore a 25MHz processor is not a gating factor in picture quality.

Not that NASA is perfect. Don’t get me wrong. Using commercial off-the-shelf equipment where feasible is a great idea. It will save money and get new technology into space faster than in the past. But bad science stories about smartphone cameras being better than Hubble don’t help.

_________________________________

In other words, the Guardian headline and article are not merely slightly inaccurate, they are entirely backwards: smartphone cameras would NOT take better pictures than the Hubble.

Perhaps you doubt me? Perhaps you think it’s just my opinion and research against the author’s?

But it isn’t just my opinion. I wrote most of the above, and then sent a copy to my friend Brian Piccioni. He’s a tech analyst who was ranked #1 in Canada and globally multiple times, who specialised in imaging companies, graphics, and smartphone chips, and he made a few suggestions to improve the article. I haven’t seen my university friend Dave Kary since 1987 at UBC, but we’re in touch on Facebook. He also helped me sharpen up the article…and Dr. David Kary is an award-winning astronomy professor at Citrus College in California. Finally, Dr. Savvas Chamberlain (Ph.D., M.Sc., D.Eng., FRSC, FIEEE, FCAE, FEIC, C.M. and Member of the Order of Canada) is someone I have known for years, and I had the honour of moderating a panel he was on at a semiconductor conference last year. Not only is Savvas the former CEO of Dalsa, he has published more than 150 papers on image sensors, CCDs and other semiconductor devices, and authored or co-authored 20 patents related to image sensing.

I am responsible for the final product, of course. But I did want to share the level of research and review that went into this analysis.


