This article first appeared in Personal Computer World magazine, September 1996.
THERE IS NOTHING inherently electrical about computing. In conventional computers, 0s and 1s are represented by the presence or absence of particular voltage levels, but working machines have been constructed using technologies as diverse as fluid flow and beams of light. Researchers seeking the ultimate in miniaturisation are now investigating the use of individual sub-atomic particles. The possibilities are as astonishing as they are powerful. Prepare to have your mind boggled by the Quantum Mechanical computer.
We expect the things around us to possess certain definite properties, whether or not we happen to be observing those properties. Suppose I start a toy top spinning. If I ask you to tell me whether it is spinning clockwise or anticlockwise, you would need to examine it, and what you would discover is the way it had been spinning since I started it. It would still have been spinning that way whether you'd looked at it or not.
Now consider a sub-atomic particle such as an electron, spinning about an axis. We would expect that at any time, like the toy top, the particle must be in one of two states: either spinning clockwise, or spinning anticlockwise. Not so, according to the laws of Quantum Mechanics, the physics which describes the micro-world. Until we actually make a measurement of the particle's spin, it actually has no definite spin at all (see illustration). Rather, it is in a 'superposition' of its two possible spin states. All we can say is that there is some probability that its spin, when measured, will turn out to be clockwise or anticlockwise. Until the measurement is actually made, the particle, in some bizarre sense, is actually spinning in both directions at once!
Such weird behaviour would seem to rule out using the particle to represent a binary digit, but there is a surprising advantage lurking here. Let us say that our 'quantum bit', or 'qubit', represents a 0 if it is found to spin clockwise, and a 1 if anticlockwise. Now, imagine that we excite the qubit with suitable laser light, such that its spin is an equally probable superposition of clockwise (0) and anticlockwise (1). Imagine further that we take, say, 32 of these qubits and assemble them side-by-side to make a register. In a conventional computer, a 32-bit register can store a single number in the range 0 to 2³² - 1. However, because each of our qubits is in a superposition of 0 and 1, every one of the four billion possible integers between 0 and 2³² - 1 is somehow simultaneously present in the register.
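To get a feel for the bookkeeping, here is a classical sketch of my own (a real quantum computer does not work this way internally): an n-qubit register can be described by a list of 2^n 'amplitudes', one per possible integer, and putting every qubit into an equal superposition gives every integer the same weight.

```python
# Classical simulation sketch (illustrative only, not from the article):
# an n-qubit register as a vector of 2**n amplitudes, one per integer.
import math

def equal_superposition(n_qubits):
    """State vector with every qubit in an equal 0/1 superposition."""
    size = 2 ** n_qubits                 # one amplitude per possible value
    amp = 1 / math.sqrt(size)            # equal weight for every integer
    return [amp] * size

state = equal_superposition(4)           # 4 qubits: 16 simultaneous values
print(len(state))                        # 16 basis states
print(round(sum(a * a for a in state), 6))  # probabilities sum to 1.0
```

The squared amplitudes are the probabilities of the possible measurement outcomes, which is why they must sum to one.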
Now comes the clincher. If we perform a computation (such as 'add one') on the register, the computation will actually be performed simultaneously on all the possible numbers in it, leaving the register containing all the corresponding results. We get parallelism for free, on a massive scale, from just a single register. There must be a catch here, you're thinking.
There is. Although the register contains a superposition of all the results of the computation, to examine the register we must perform a measurement on each qubit, each of which then collapses to a 0 or a 1. We are left with a single 32-bit number, chosen at random from all the results of the computation. It would appear that we haven't gained anything: although 4 billion computations have been performed simultaneously, we see the result of only one of them, and a randomly chosen one at that.
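Both halves of the story can be sketched classically (again, an illustration of mine, on a toy 3-qubit register): 'add one' shifts every amplitude to the next value in a single stroke, and measurement then throws all but one of the results away.

```python
# Illustrative classical sketch of quantum parallelism and measurement.
import random

n = 3
size = 2 ** n                             # 8 possible values, 0..7

# Put the register into a superposition of the even numbers 0, 2, 4, 6.
state = [0.5 if k % 2 == 0 else 0.0 for k in range(size)]

# 'Add one' acts on every superposed value at once: the amplitude held
# by value k moves to value (k + 1) mod 2**n -- a single permutation.
state = [state[(k - 1) % size] for k in range(size)]

# Measurement collapses the register to ONE result, drawn with probability
# equal to the squared amplitude; all the other results are lost.
probs = [a * a for a in state]
result = random.choices(range(size), weights=probs)[0]
print(result)       # one of 1, 3, 5 or 7 -- every even input gained one
```

Every even input was incremented in one step, yet the measurement hands back just a single odd number, chosen at random: parallelism gained, then mostly squandered.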
However, there are mathematical algorithms which involve performing computations just like this, and which are therefore ideally suited to quantum parallelism. Peter Shor, of AT&T Bell Laboratories, has devised such an algorithm for finding the factors of large numbers. Even the fastest known conventional factorisation methods are so compute-intensive that the problem is effectively insoluble on the largest modern computers. For example, a 1994 experiment to factor a 129-digit number took eight months of solid computation using 1600 cooperating workstations on the Internet. According to Shor, a quantum computer would find the answer in a few seconds.
This has an important implication for public-key cryptography, which depends for its security on the practical computational impossibility of factoring numbers with around 250 digits. Public-key cryptography is currently considered 'unbreakable', and therefore widely used in financial and military systems, as well as in the Pretty Good Privacy (PGP) program popular with Internet users. With Shor's factorisation algorithm running on even a simple quantum computer, all the world's 'secure' cryptography might disappear overnight.
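To see concretely why factoring matters, here is a toy sketch of my own, using unrealistically small primes (real keys use numbers hundreds of digits long): in an RSA-style scheme, anyone who can factor the public modulus can recompute the private key.

```python
# Toy RSA illustration (mine, not the article's): factoring n breaks the key.
p, q = 61, 53
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)        # computable only once p and q are known
d = pow(e, -1, phi)            # private exponent, derived from the factors

msg = 42
cipher = pow(msg, e, n)        # anyone can encrypt with the public key
plain = pow(cipher, d, n)      # decrypting requires d, i.e. the factors
print(plain)                   # recovers 42
```

With 61 and 53 the factoring step is trivial; with 250-digit primes it is, classically, hopeless, and that gap is the entire security of the scheme. Shor's algorithm closes the gap.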
For now, quantum computers exist only on paper, and there remain formidable theoretical and engineering barriers to their construction. But perhaps the technology will eventually arrive on our desktops. Accompanied, no doubt, by a copy of Personal Quantum Computer World.
Toby Howard teaches at the University of Manchester.