Why the age of quantum computing is nearer than you think
New research from the Max Planck Institute of Quantum Optics is one of the best examples yet of quantum computing beginning to flirt with practical technology.
Tech buffs, investors, IT industrialists, and boffins alike eagerly await the day when the science of quantum computing yields practical technology. Physicists at the Max Planck Institute of Quantum Optics (MPQ) recently published research that, they believe, has brought that pivotal day closer.
For many years, physicists have sought to create an information network far superior to today's by exploiting quantum phenomena. The team of German researchers has constructed the first vital component of such a network: a link between two atomic nodes over which information can be sent, received, and stored using a single photon. Successful exchanges of information recently took place in Garching, Germany, between two MPQ labs connected by a 60-meter fiber-optic cable. Though only a prototype, this rudimentary network could be scaled up into larger and more distant quantum networks. The team reports its research in Nature.
The idea of quantum computing was introduced by the physicist Richard Feynman in 1982. The essential unit of classical computing, the bit, is binary. Like a light switch, it's either on or off, 1 or 0. The quantum bit, by contrast, can be 1, 0, or a mix of both states – this last state being like a flipped coin that's still spinning in the air.
The usefulness of this extra dimension seems, at first pass, more confusing than anything else, but it actually creates a new way to represent data. Whereas a trillion classical bytes can hold about 2^43 discrete on/off values, a mere 200 quantum bits, or qubits, could represent at least 2^200 discrete values. This new capacity would allow future computers to do involved calculations at nearly unthinkable speeds and solve problems that are currently unsolvable. The technological implications are too many to list, which suggests why there's such excitement surrounding the field. [Editor's note: An earlier version got bits mixed up with bytes.]
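To make those numbers concrete, here is a short back-of-the-envelope calculation (an illustrative sketch, not from the researchers themselves): a trillion bytes is 8 × 10^12 bits, which is roughly 2^43 individual on/off values, while 200 qubits span a state space of 2^200.

```python
import math

# A trillion classical bytes = 8 * 10**12 bits,
# each bit holding a single on/off value.
classical_bits = 8 * 10**12
print(math.log2(classical_bits))  # ~42.9, i.e. roughly 2**43 values

# By contrast, 200 qubits span a state space of 2**200 values.
qubit_states = 2**200
print(len(str(qubit_states)))  # 61 -- a 61-digit number
```

The point of the comparison: the classical register grows linearly with the number of bits, while the quantum state space doubles with every qubit added.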
What's more, this excitement has spiked over the last half-decade. Quantum computing has existed in theory for thirty years, and droves of physicists have been busy researching and proffering new quantum algorithms, novel media for information storage (such as diamonds or buckyballs), cryptographic techniques, unique logic gates, and many more applications. All for a computer that does not yet exist.