New research published out of the Max Planck Institute of Quantum Optics is one of the best examples of quantum computing beginning to flirt with practical technology.
Tech buffs, investors, IT industrialists, and boffins alike eagerly await the day when the science of quantum computing yields practical technology. Physicists at the Max Planck Institute of Quantum Optics (MPQ) recently published research that, they believe, has brought that pivotal day closer.
For many years, physicists have sought to exploit quantum phenomena to create an information network far superior to today's. The team of German researchers has constructed the first vital component of such a network: a link between two atomic nodes over which information can be sent, received, and stored using a single photon. Successful exchanges of information recently took place in Garching, Germany, between two MPQ labs connected by a 60-meter fiber-optic cable. Though only a prototype, this rudimentary network could be scaled up into more complex networks spanning greater distances. The team reports its research in Nature.
The idea of quantum computing was introduced by the physicist Richard Feynman in 1982. The essential unit of classical computing, the bit, is binary. Like a light switch, it's either on or off, 1 or 0. The quantum bit, by contrast, can be 1, 0, or a mix of both states – this last state being like a flipped coin that's still spinning in the air.
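The "spinning coin" picture above can be sketched in a few lines of code. This is a toy model, not how real quantum hardware works: a qubit is represented as a pair of real amplitudes (real qubits use complex ones), and "measurement" collapses it to 0 or 1 with the corresponding probabilities. The function names are invented for illustration.

```python
import math
import random

# Toy qubit: a pair of amplitudes (alpha, beta) with alpha^2 + beta^2 = 1.
# Measurement yields 0 with probability alpha^2, else 1.

def make_qubit(alpha, beta):
    """Normalize a pair of real amplitudes into a valid toy qubit."""
    norm = math.hypot(alpha, beta)
    return (alpha / norm, beta / norm)

def measure(qubit, rng=random):
    """Collapse the qubit to a classical 0 or 1."""
    alpha, _beta = qubit
    return 0 if rng.random() < alpha ** 2 else 1

coin = make_qubit(1.0, 1.0)   # the "spinning coin": an equal mix of 0 and 1
counts = [0, 0]
for _ in range(10_000):
    counts[measure(coin)] += 1
# counts ends up roughly evenly split between 0s and 1s
```

A classical bit, by contrast, would be stuck at one of the two ends: `make_qubit(1, 0)` always measures 0, and `make_qubit(0, 1)` always measures 1.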
The usefulness of this extra dimension seems, at first pass, more confusing than anything else, but it actually creates a new opportunity to represent data. Whereas a trillion classical bytes can hold 2^43 discrete on/off values, a mere 200 quantum bits, or qubits, could represent at the very least 2^200 discrete values. This new capacity would allow future computers to do involved calculations at nearly unthinkable speeds, and solve problems that are currently unsolvable. The technological implications are too many to list, which suggests why there's such excitement surrounding the field. [Editor's note: An earlier version got bits mixed up with bytes.]
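The arithmetic behind that comparison is easy to check directly; Python's arbitrary-precision integers make the 2^200 figure concrete:

```python
# A trillion bytes is 8 trillion bits, each a single on/off value.
classical_bits = 10 ** 12 * 8        # 8,000,000,000,000 bits
power_of_two = 2 ** 43               # ~8.8 trillion, on par with the above

# The joint state of just 200 qubits spans 2^200 values:
qubit_states = 2 ** 200
print(f"2^43  = {power_of_two:,}")
print(f"2^200 is a {len(str(qubit_states))}-digit number")
```

For scale, 2^200 is a 61-digit number, vastly more than the number of atoms in the observable universe's stars.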
What's more, this excitement has spiked over the last half-decade. Quantum computing has existed in theory for thirty years, and droves of physicists have been busy researching and proffering new quantum algorithms, novel media for information storage (such as diamonds or buckyballs), cryptographic techniques, unique logic gates, and many more applications. All for a computer that does not yet exist.
But recent advances suggest that the inception of such a computer is closer than many have thought. This was the crux of a recent New York Times article detailing improvements IBM has made in honing quantum computing: specifically, IBM researchers sped up computation and extended the lifetime of certain qubits, which tend to be unstable. It's usually a good sign for an emerging technology when the in-house applied research arm of a major tech company, an arm that tends to invest its time and resources very conservatively, conveys optimism about the technology's application.
The MPQ team has developed perhaps one of the most distilled and versatile permutations of this technology. It uses two lone rubidium atoms as the nodes of the network. The qubit of information, stored as the quantum state of one of the atoms, can be transferred via an emitted photon, which carries the quantum information until it is absorbed by the other rubidium atom. Over the winding fiber-optic cable, information is transmitted, received, and stored. The process is also completely reversible.
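The reversibility of the transfer can be caricatured in code. The sketch below is a classical cartoon of the protocol, not the MPQ physics: the sender atom's state is swapped onto a "photon" register, carried to the receiver, and swapped onto the receiving atom. Because the state is swapped rather than copied, the process runs equally well in reverse and respects quantum mechanics' no-cloning rule. All names here are invented for illustration.

```python
# Cartoon of photon-mediated state transfer between two atomic nodes.

def swap(a, b):
    """Exchange two states; returns the new (a, b)."""
    return b, a

atom_a = (0.6, 0.8)      # sender node holds the qubit's amplitudes
atom_b = (1.0, 0.0)      # receiver node starts in the "0" state
photon = (1.0, 0.0)      # the photon register starts empty

atom_a, photon = swap(atom_a, photon)   # emit: write the state onto the photon
photon, atom_b = swap(photon, atom_b)   # absorb: deposit it at node B
# atom_b now holds (0.6, 0.8); atom_a has been left in the "0" state
```

Running the two swaps again in reverse order returns the qubit to node A, mirroring the reversibility the researchers report.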
Send, read, write, save. The essential functions of networked computing can now be demonstrated in a system of just two atoms.
To heighten the chances of interaction between the photons and the rubidium atoms, an event that would otherwise almost never occur, the physicists designed "optical cavities": mirror-lined pockets that direct, and then continually redirect, the photon through the rubidium.
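The benefit of continually redirecting the photon can be seen with a back-of-the-envelope calculation. The numbers below are assumed for illustration, not taken from the MPQ experiment: if a single pass interacts with probability p, then after n passes the chance of at least one interaction is 1 - (1 - p)^n.

```python
# Illustrative only: how repeated passes through a cavity raise the
# cumulative chance that the photon interacts with the atom at least once.

def interaction_probability(p_single_pass, n_passes):
    return 1 - (1 - p_single_pass) ** n_passes

p = 0.001                                    # assumed per-pass probability
print(interaction_probability(p, 1))         # one pass: 0.001
print(interaction_probability(p, 5000))      # 5,000 passes: roughly 0.99
```

Even a tiny per-pass probability compounds into near-certainty when the mirrors force thousands of passes.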
The experiment also demonstrates the possibility of something particularly eyebrow-raising in the field of quantum computing: entanglement. Quantum entanglement occurs when particles interact physically, correlate their quantum states, and are then separated. The result is that a manipulation or measurement (which at the quantum scale are the same thing) of one quantum state affects the other. Hence they are "entangled." For example, measuring the spin as clockwise in particle A will spontaneously render the spin of particle B counterclockwise, regardless of whether particle B is six feet, six miles, or six light-years away. The distance separating the two does not matter. If you find this disconcerting, you're in good company. Albert Einstein, who could never quite accept the capricious nature of quantum physics, once decried this phenomenon as "spukhafte Fernwirkung," or "spooky action at a distance."
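The perfect anticorrelation in the spin example can be sampled with a toy generator. To be clear, this sketch only reproduces the statistics of the anticorrelated pair; it says nothing about the underlying mechanism, which is exactly what made Einstein uneasy. The function name is invented for illustration.

```python
import random

# Sample the joint outcome of an anticorrelated entangled pair: whenever
# particle A measures clockwise, particle B measures counterclockwise,
# and vice versa, no matter how far apart the particles are.

def measure_entangled_pair(rng=random):
    a = rng.choice(["clockwise", "counterclockwise"])
    b = "counterclockwise" if a == "clockwise" else "clockwise"
    return a, b

for _ in range(5):
    a, b = measure_entangled_pair()
    print(a, "/", b)   # the two outcomes always disagree
```

Each particle's own result looks like a fair coin flip; only the comparison between the two reveals the correlation.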
It is impossible to predict the many implications of a viable quantum computer composed of a few hundred qubits, let alone the possibilities of technology derived from quantum entanglement, a concept that still puzzles many physicists. The MPQ team has openly acknowledged that this prototype can be improved markedly (their current success rate for transferring the quantum states is 0.2 percent). But consider the progress it marks since 1982, when quantum computing was a mere idea, and most definitely not an elegant experiment running the distance between two German labs.