This article first appeared in Personal Computer World magazine, February
We've reported several times on developments in quantum computing: the use of subatomic particles for storing data and performing massively parallel computations. But just when we think we've covered the latest development, something new always pops up. According to the latest research from the University of Michigan, it's possible to store a database as large as the British Library in (wait for it) a single electron.
The humble electron is almost unimaginably small (you could pack one thousand million into a millimetre) and inhabits that mysterious quantum realm where every "particle" leads a weird double life: it's also a wave. Particles are fairly easy to picture: they're little blobs of "stuff", and we can imagine them bouncing around and clumping to form atoms and molecules. But waves? Waves of what, exactly? Here's where it begins to get strange. The waves are purely mathematical constructs, representing probabilities. Yet they have some kind of physical reality too.
Until recently, quantum computing researchers have treated fundamental particles as "particles" with an associated spin. A given electron can spin "up" or "down", so up can represent a 0, and down a 1 (or vice-versa; it doesn't matter). But, and this is perhaps the biggest "but" in all of quantum mechanics, until it's actually measured, an individual electron is simultaneously in both states: it's somehow spinning both up and down (don't try to visualise this; it's impossible). Only when you measure the spin does it become up or down. What this means is that if, before you measure it, you can persuade an electron to take part in a computation, you get two results for the price of one: the result for the electron representing a 0 (spinning up) and the result for it representing a 1 (spinning down). Do this with a bunch of electrons and you have massively parallel computation. This much has already been demonstrated in the laboratory.
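The "both states at once" idea can be sketched on a conventional computer by simulating a single qubit as a two-element state vector. This is purely illustrative maths, not anything from the Michigan experiments: index 0 stands for "spin up" (bit 0), index 1 for "spin down" (bit 1), and the standard Hadamard gate puts the qubit into an equal mixture of the two.

```python
import numpy as np

# A qubit as a 2-element complex state vector.
# Index 0 = "spin up" (bit 0), index 1 = "spin down" (bit 1).
up = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate turns a definite spin into an equal superposition
# of both spins at once.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ up

# The probability of measuring each spin is the squared magnitude
# of its amplitude: 50/50 until the measurement is actually made.
probs = np.abs(superposed) ** 2
print(probs)  # [0.5 0.5]
```

Until you "measure" (sample from those probabilities), the vector genuinely carries both possibilities, which is why one quantum operation can act on both values of the bit simultaneously.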
But the latest research is focusing on the wave-like nature of the electron. Physicist Philip Bucksbaum of the University of Michigan is storing data in the waves associated with a single electron, encoding a string of 0s and 1s. In the initial experiments, he's working with a single caesium atom, excited by short bursts of laser light. After exposure to the light, the overall energy of the atom is increased, sending its electrons to higher-energy states. It's these states that can be used to encode binary data. The theory was worked out in 1997 by quantum computing pioneer Lov Grover at Bell Labs, but this is the first time it's been demonstrated to work in the lab.
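The general flavour of storing a bit string in a wave, rather than in spins, can be caricatured in a few lines of simulation. This is emphatically not Bucksbaum's laser scheme, just a toy: a wavefunction spread over eight hypothetical energy states, with each 1-bit imprinted as a phase flip on its state. In the real experiment the phases would have to be recovered by interferometry; here we cheat and inspect the amplitudes directly.

```python
import numpy as np

# Toy model only: an 8-bit string stored in the phases of 8 basis
# states of a single simulated wavefunction.
bits = [1, 0, 1, 1, 0, 0, 0, 1]

# Start with an equal superposition over all 8 states.
psi = np.ones(len(bits), dtype=complex) / np.sqrt(len(bits))

# Imprint the data: a 1-bit flips the phase of its state,
# a 0-bit leaves it alone.
psi *= np.array([(-1) ** b for b in bits])

# "Read back" by looking at the sign of each amplitude. A real
# experiment cannot peek at amplitudes and must use interferometry.
decoded = [0 if a.real > 0 else 1 for a in psi]
print(decoded)  # [1, 0, 1, 1, 0, 0, 0, 1]
```

The point of the caricature is that the whole string lives in one wavefunction: adding more states (in principle) adds more storage without adding more electrons.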
At the moment, Bucksbaum is only storing 8 bits in an electron, but according to theory there's no practical limit to the amount of data that could be stored. He's refreshingly honest about his team's experiments: "It is important to keep this study in perspective," he says. "It's a new concept. Most researchers are using the spin of a quantum particle as a storage medium. Our work may turn out to be a step on the pathway to a viable quantum computer system or it could be a complete dead-end. The field is still too new to know which approach will succeed."
Today, quantum computing remains firmly in the research labs, at the "proof of concept" level, and there are formidable practical difficulties to overcome, like making it all work at anything approaching room temperature. But if it ever goes mainstream, watch out. A single program run by a single quantum computer will pack more processing power than the sum of all the conventional computers ever built.
Toby Howard teaches at the University of Manchester.