INTRODUCTION

Quantum computing may well be the future of most high-end data centres. As the demand to intelligently process a growing volume of online data rises, the limits of silicon chip microprocessors will increasingly be reached. Sooner or later it will become impossible to miniaturize traditional computing components further, and hence to continue to achieve year-on-year increases in computer power. Today, Intel's latest microprocessors are based on an industrial process that produces transistors only 22 nanometres wide. Further advances in this technology are still possible. But at some point miniaturization will hit a physical limit, as transistors only a few atoms in size will simply not be able to function.


Enter quantum computing -- an emerging science that quite literally goes beyond the laws of classical physics. Over the next few decades, quantum computing could be the next-wave development to deliver computer power well beyond current comprehension. Today, all of us cast ever-larger digital data shadows each time we use the Internet, or even when we pass a CCTV or other camera linked into a vision recognition system. At present there is simply no way to process all of the data that every person on the planet produces. But as quantum computers arrive, doing so may finally become possible. Read on to learn more about quantum computing -- and/or watch my Explaining Quantum Computing video.


FROM BITS TO QUBITS

Conventional computers are built from silicon chips that contain millions or billions of miniature transistors. Each of these can be turned "on" or "off" to represent a value of either "1" or "0", and hence conventional computers store and process data using "binary digits" or "bits". In contrast, quantum computers work with "quantum bits" or "qubits", which are represented in hardware using quantum mechanical states rather than transistors that are turned "on" or "off". For example, quantum computers may use the spin direction of a single atom to represent each qubit, or alternatively the spin direction of a single electron or the polarization orientation of a photon. Yet other quantum computing designs supercool rare metals to allow qubits to be represented by the quantum spin of a tiny magnetic field.


Due to the peculiar laws of quantum mechanics, an individual qubit can represent a value of "1", "0", or both values simultaneously. This is because the sub-atomic particles used as qubits can exist in a combination of states -- a "superposition" -- at exactly the same point in time. Because each of these states carries an associated probability, a single qubit can in effect work with a whole range of values at once. In turn, this is what allows quantum computers to be orders of magnitude more powerful than their conventional, purely digital counterparts.
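As a loose illustration of this idea (a generic Python/NumPy sketch of the textbook model, not tied to any real quantum hardware), a single qubit's superposition can be represented as a pair of complex amplitudes whose squared magnitudes give the probabilities of reading each value:

```python
import numpy as np

# A qubit in superposition is conventionally written a|0> + b|1>, where
# a and b are complex "amplitudes" satisfying |a|^2 + |b|^2 == 1.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal mix of "0" and "1"

state = np.array([a, b], dtype=complex)

# The squared magnitude of each amplitude is the probability of
# measuring that value.
p0, p1 = abs(a) ** 2, abs(b) ** 2
print(p0, p1)    # 0.5 and 0.5: a 50/50 chance of reading "0" or "1"
```

Tilting the amplitudes away from this 50/50 split is how the probabilities mentioned above get attached to each state.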


The fact that qubits are more "smears of probability" than definitive, black-and-white certainties is exceptionally weird. Flip a coin and it cannot come up both heads and tails simultaneously, and yet the quantum state of a qubit can in some senses do just that. It is therefore hardly surprising that the renowned physicist Niels Bohr once stated that "anyone who is not shocked by quantum theory has not understood it!"


Another very bizarre thing is that the act of directly observing a qubit causes its quantum state to "collapse" out of superposition into a single definite value. In practice this means that, when data is read from a qubit, the result will only ever be a "1" or a "0". The rich quantum information held in a superposition therefore remains "hidden", and can never be read out directly. Instead, quantum computers apply "quantum gates" -- operations that manipulate the superpositions of qubits that are never directly measured or otherwise observed -- so that useful answers are steered toward the states that finally are measured.
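This collapse can be sketched in the same hypothetical toy model used above: reading the qubit picks one outcome at random according to the amplitude probabilities, after which the superposition is simply gone (the `measure` helper below is an illustrative invention, not any real quantum API):

```python
import numpy as np

rng = np.random.default_rng(0)

# An unequal superposition: an 80% chance of "0", a 20% chance of "1".
state = np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex)

def measure(state):
    """Read the qubit: pick 0 or 1 with the amplitude probabilities,
    then 'collapse' the state to the single outcome that was seen."""
    probs = np.abs(state) ** 2
    outcome = rng.choice([0, 1], p=probs)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0          # all the amplitude now sits here
    return outcome, collapsed

outcome, state = measure(state)
print(outcome)    # always a plain 0 or 1 -- never both at once
```

Note that after the call the `state` array holds a single 1.0: measuring the same qubit again would give the same answer, because the superposition has collapsed.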


Because qubits can be used to store and process not just the digital values of "1" and "0", but also many shades of grey in between, quantum computers have the potential to perform massively parallel processing. This means that quantum computers will be very effective at performing tasks -- like vision recognition, medical diagnosis, and other forms of artificial intelligence processing -- that can depend on very complex pattern matching activities way beyond the capabilities of both traditional computers and most human beings.
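The source of this massive parallelism can be shown with the same kind of toy model: each extra qubit doubles the number of amplitudes a quantum register holds, so n qubits in superposition carry 2**n values at once. A NumPy sketch, combining qubits with the Kronecker (tensor) product:

```python
import numpy as np

# One qubit holds 2 amplitudes; a register of n qubits holds 2**n.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition

n = 10
register = plus
for _ in range(n - 1):
    register = np.kron(register, plus)   # tensor on another qubit

print(len(register))    # 1024 == 2**10 amplitudes held simultaneously
```

Ten qubits already span 1,024 simultaneous values; fifty would span over a quadrillion, which is why operations on a modest register can amount to massively parallel processing.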


QUANTUM COMPUTING PIONEERS

OK, so quantum computing may sound all very theoretical (and indeed at present a lot of it actually is!). However, practical quantum computing research is now very much under way. Perhaps most notably, back in 2007 a Canadian company called D-Wave announced what it described as "the world's first commercially viable quantum computer". This was based on a 16-qubit processor -- the Rainer R4.7 -- made from the rare metal niobium supercooled into a superconducting state. At its launch demonstration, D-Wave showed this computer performing several tasks, including playing Sudoku and creating a complex seating plan.


Many people at the time were somewhat sceptical of D-Wave's claims. However, in December 2009, Google revealed that it had been working with D-Wave to develop quantum computing algorithms for image recognition purposes. Experiments had included using a D-Wave quantum computer to recognise cars in photographs faster than possible using any conventional computer in a Google data centre. Around this time, there was also an announcement from IBM that it was rededicating resources to quantum computing research in the "hope that a five-year push [would] produce tangible and profound improvements".


In 2011, D-Wave launched a fully commercial, 128-qubit quantum computer. Called the D-Wave One, this is described by the company as a "high performance computing system designed for industrial problems encountered by Fortune 500 companies, government and academia". The D-Wave One's supercooled 128-qubit processor is housed inside a cryogenics system within a 10 square metre shielded room. Just look at the picture here and you will see the sheer size of the thing relative to a human being. At launch, the D-Wave One cost $10 million, and the first unit was sold to US aerospace, security and military giant Lockheed Martin in May 2011.


D-Wave aside, other research teams are also making startling quantum computing advances. For example, in September 2010, the Centre for Quantum Photonics in Bristol in the United Kingdom reported that it had created a new photonic quantum chip. This is able to operate at normal temperatures and pressures, rather than under the extreme conditions required by the D-Wave One and most other quantum computing hardware. According to the guy in charge -- Jeremy O'Brien -- his team's new chip may be used as the basis of a quantum computer capable of outperforming a conventional computer "within five years".


Another significant quantum computing milestone was reported in January 2011 by a team from Oxford University. Here strong magnetic fields and low temperatures were used to link -- or "quantumly entangle" -- the electrons and nuclei of a great many phosphorus atoms inside a highly purified silicon crystal. Each entangled electron and nucleus was then able to function as a qubit. Most startlingly, ten billion quantumly entangled qubits were created simultaneously. If a way can be found to link these together, the foundation will have been laid for an incredibly powerful computing machine. In comparison to the 128-qubit D-Wave One, a future computer with even a fraction of a 10-billion-qubit capacity could clearly possess a quite literally incomprehensible level of processing power.


THE QUANTUM ROAD AHEAD

Quantum computing is a highly complex and bewildering field with incredible potential (though so too was microelectronics in the 1970s, and we all now take that for granted!). For a far more technical overview of the topic, try reading this overview from Stanford University. You may also want to look at IBM's Quantum Computing pages, visit the Australian Centre of Excellence for Quantum Computation and Communication Technology, or browse on over to D-Wave's Technology Overview. Do be aware, however, that delving into any and all of these resources may well make your head hurt!


Ultimately, few companies and individuals will ever own a quantum computer. Nevertheless, within a decade or two most companies and individuals are very likely to be regularly accessing quantum computers from the cloud. Not least this is because one of the first mainstream applications of quantum computing will be in online security and data encryption. Today, most online security systems rely on encryption whose safety rests on how hard it is to factor very large numbers into primes -- a calculation that quantum computers are potentially very good at indeed. Fairly soon, anybody with a quantum computer may therefore theoretically be able to use it to crack the security on any bank account or cloud computing resource. The only way to prevent this will be to protect and encrypt all online resources with quantum security gateways. The demand for every bank and cloud provider to invest in a quantum computer -- if only for encryption purposes -- is therefore likely to skyrocket once the technology moves beyond its currently rather costly and cumbersome experimental phase. Almost certainly signalling the potential significance of quantum computing in code-making and code-breaking, in March 2012 it was reported that the US National Security Agency is spending $2bn on a highly-fortified data centre with a 512-qubit quantum computer.
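To see why factoring matters, here is a deliberately tiny and insecure illustration in Python. An RSA-style public key is built from the product of two secret primes; with toy numbers a brute-force loop recovers them instantly, while at real key sizes (hundreds of digits) no known classical method finishes in useful time -- but a large enough quantum computer running Shor's algorithm could:

```python
# Toy RSA setup: security rests entirely on how hard it is to split the
# public modulus n back into its secret prime factors p and q.
p, q = 61, 53
n = p * q                      # the public modulus: 3233

def crack(n):
    """Brute-force factoring: trivial here, infeasible at real key sizes."""
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate

print(crack(n))                # (53, 61) -- the secret primes recovered
```

Scale `n` up to 2048 bits and the loop above would outlast the universe; the quantum threat is that Shor's algorithm factors such numbers in polynomial time.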


Another major application area for quantum computing will be in the processing of Big Data. As the volume of digital data produced on Planet Earth continues to grow exponentially, so a significant potential exists to generate business and social value by insightfully interlinking it. While technologies like Hadoop are currently permitting advancements in the processing of vast data sets, it may well be the development of quantum computers that really pushes large-scale Big Data analysis into the mainstream.
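The map-reduce idea behind Hadoop can be sketched on a single machine: each document is independently "mapped" into key-value pairs, which are then "reduced" into combined totals. Word counting is the classic example (in a real cluster, the map tasks run in parallel across many machines):

```python
from collections import Counter
from itertools import chain

documents = [
    "big data needs big processing",
    "quantum computers may process big data",
]

# Map: each document independently becomes a list of (word, 1) pairs.
mapped = [[(word, 1) for word in doc.split()] for doc in documents]

# Reduce: counts for the same word are combined across all documents.
counts = Counter()
for word, one in chain.from_iterable(mapped):
    counts[word] += one

print(counts["big"])    # 3
```

It is exactly this kind of large, embarrassingly parallel workload that advocates expect quantum hardware to one day accelerate.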


For more information on quantum computing, you may like to watch my Quantum Computing Video. Information on a range of other future technologies can also be found on our sister site ExplainingTheFuture.com.


