As the technology of computing has evolved, the limitations of “classical” computers have grown more apparent. Many problems are so unwieldy that they would take several times the age of the universe to solve. It’s becoming increasingly clear that more processing power isn’t enough; fundamentally different approaches to computing are also needed.

For the past few decades, several types of computing have been proposed and investigated, including DNA computing, chaos computing and, perhaps the closest to becoming reality, quantum computing.

Last year, Google, in a joint initiative with the Universities Space Research Association and NASA, announced the creation of the Quantum Artificial Intelligence Lab (QuAIL). Its goal is to pioneer research into how quantum computing might help with difficult computer science problems.

Support structure for installation of the D-Wave processor. Source: NASA Ames

The lab is located at NASA’s Ames Research Center in California’s Silicon Valley. There, QuAIL installed a D-Wave Two, a quantum computer that uses quantum annealing to find the ground state of an artificial Ising spin system. The D-Wave Two was developed and manufactured by D-Wave Systems Inc.

Although the D-Wave Two is an impressive machine, some controversy exists as to whether large-scale entanglement occurs inside it, calling into question whether it truly is a quantum computer. Meanwhile, new technologies have emerged that promise to deliver a scalable, fully quantum computer.

Google took notice and has recruited quantum computing expert Dr. John Martinis, a professor of physics at the University of California, Santa Barbara (UCSB), and his quantum computing team to join QuAIL to help design and build its own quantum computer. Martinis will be a joint employee of Google and UCSB. He will be located in Google’s Santa Barbara office and will continue to use UCSB’s fabrication and measurement facilities. This work will be in addition to Google’s continued collaboration involving the D-Wave Two.

Google has good reasons for getting involved in quantum computing. One of the first potential applications identified for quantum computers was the prime factoring of large numbers, which could be used to break widely used cybersecurity protocols. This attracted the attention and financial support of the U.S. government and spurred much of the early development in quantum computing.

But quantum computers are capable of much more. Ultimately, they may be best suited for applications that exploit their inherent parallelism, described in more detail below. Such complex problems arise in drug discovery, proteomics, logistics, many-parameter optimization and machine learning, among other areas. Popular applications ranging from e-mail spam filters and online-shopping suggestions to smart power grids stand to benefit from the development of quantum computers.

If Google succeeds, it would become the gatekeeper for solving problems that are otherwise intractable for traditional computers.

The fact that quantum computers have so many uses isn't surprising. After all, it was the limitations of classical computers that first led to the idea of quantum computing. More than 30 years ago, a group of physicists and computer scientists held a conference on the Physics of Computation at MIT. Among the points to emerge from the meeting was that many problems in science cannot be calculated efficiently by classical computers. The broad outline of a computer based upon quantum mechanics was proposed.

In 1982, Paul Benioff, a physicist at Argonne National Laboratory near Chicago, described how a Turing machine could operate under quantum formalism. A Turing machine is a way to model computation, meaning that Benioff was laying the groundwork for quantum computation based not upon the hardscrabble world of zeros and ones, but on the quirky indeterminism of quantum phenomena.

Much of the next two decades saw a fleshing out of the theoretical aspects of quantum computing, including the first description of a universal quantum computer, the investigation of entanglement as a means of manipulating qubits (quantum bits) and the identification of problems best suited for quantum computing. By the late 1990s, government funding of quantum computing grew quickly due to its potential cryptography applications. This led to the first experimental demonstration of a quantum algorithm, in 1998, by a team at Oxford University using a 2-qubit NMR quantum computer. In the years since, quantum computing has seen a flurry of innovations and enhancements in both theory and technology.

The fundamental difference between traditional computers and quantum computers lies in the difference between bits and qubits. Classical computers utilize binary digits, or bits, to represent information. One bit can have one of two unique values, 1 or 0. Two bits can represent four possible unique values: 00, 01, 10 and 11. The more bits there are, the more unique values can be represented. In general, there are 2^N values for N bits, so, for example, a byte (8 bits) can represent 256 unique values.
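To make the 2^N count concrete, here is a toy Python sketch (illustrative only, not from the article) that enumerates every value N bits can represent:

    # Toy illustration: enumerate the unique values representable by N classical bits.
    from itertools import product

    def bit_values(n_bits):
        """Return every bit string of length n_bits (there are 2**n_bits of them)."""
        return ["".join(bits) for bits in product("01", repeat=n_bits)]

    print(bit_values(2))  # ['00', '01', '10', '11']
    for n in (1, 2, 8):
        print(f"{n} bit(s): {len(bit_values(n))} unique values")  # 2, 4, 256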

By contrast, a quantum computer uses quantum binary digits, or qubits, to represent information. Unlike the familiar notions of 0 and 1 found in the classical computer, things are a bit less certain in the quantum world. That's because a qubit is in a superposition of states when it is not being measured. This means that, until it is measured, it is partly 0 and partly 1, with a parameter (a complex amplitude) associated with each state.
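In standard Dirac notation (a textbook convention, not specific to any machine discussed here), the state of a single qubit before measurement is written

    \[
      \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
      \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
    \]

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|², after which the superposition is gone.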

Add another qubit through a process called entanglement, and the pair is described by four parameters, the equivalent of 4 classical bits' worth, because classical bits carry only one parameter each. In fact, N entangled qubits are described by 2^N parameters, the equivalent of 2^N classical bits of information. These additional parameters (a sort of intrinsic parallelism from being many things at once) give quantum computers their power: for suitable problems, the number of quantum operations needed is exponentially smaller than the number of classical operations. This advantage of scale is why quantum computers have been pursued so aggressively during the past 30 years.
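A minimal numpy sketch (again illustrative only; real quantum hardware does not work by storing these vectors, which is precisely why simulating it classically is hard) shows how the description grows, with each added qubit doubling the number of complex amplitudes:

    # The state vector of N qubits holds 2**N complex amplitudes.
    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)  # |0>
    ket1 = np.array([0, 1], dtype=complex)  # |1>

    # An entangled two-qubit Bell state: (|00> + |11>) / sqrt(2)
    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
    print(len(bell))  # 4 amplitudes (= 2**2) describe just two qubits

    # The vector doubles with every qubit added.
    state = ket0
    for n in range(1, 6):
        print(f"{n} qubit(s): {len(state)} amplitudes")
        state = np.kron(state, ket0)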

Also worth noting is that in a classical computer, binary digits are represented by voltages. A zero voltage corresponds to a 0, and 3.3 or 5 volts (depending on the system) represents a 1: literally on-off. So what can be used to represent qubits in the real world? The past two decades of research have produced many ways to construct qubits, including cold atoms or ions, spins in semiconductors and superconducting circuits.

Surface code threshold where much of Google's work is focused. Source: UCSB

Martinis' group at Google is using the superconducting circuit approach, which couples nanofabricated superconducting electrodes through Josephson junctions. A Josephson junction consists of a thin layer of non-superconducting material between layers of superconducting material. The key to the junction is that a supercurrent can cross the barrier without resistance.

However, when a critical current is exceeded, some normal current flows and a voltage appears across the junction, oscillating at approximately 484 gigahertz per millivolt. Detecting and measuring the change from one state to the other is at the heart of Josephson-junction qubits, as described by Martinis and colleagues in the article “Superconducting quantum circuits at the surface code threshold for fault tolerance,” published this year in the journal Nature.
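The textbook Josephson relations (standard physics, not drawn from the Nature paper itself) make this voltage-to-frequency conversion explicit:

    \[
      I = I_c \sin\varphi, \qquad
      \frac{d\varphi}{dt} = \frac{2e}{\hbar} V, \qquad
      f = \frac{2e}{h} V \approx 483.6~\mathrm{GHz/mV},
    \]

where I_c is the junction's critical current, φ is the superconducting phase difference across the barrier and V is the voltage across the junction.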

To create a working quantum processor, there must be a means to store the quantum information, to detect and correct errors and to read out the qubits. The Martinis group achieves this using a circuit constructed of aluminum on sapphire and cooled to 20 millikelvin. The qubits themselves are five Xmons, a variant of the transmon qubit. Attached to each Xmon is a coplanar waveguide resonator used for individual state readout, and control wiring is used to detect and correct errors in the entangled qubits.
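Real quantum error correction must detect errors by measuring parities of neighboring qubits without reading out the data directly, but the decoding idea can be sketched with its classical analogue, a 3-bit repetition code (a toy Python illustration, not the surface code scheme the group actually uses):

    # Toy classical analogue of error correction: the 3-bit repetition code.
    import random

    def encode(bit):
        return [bit, bit, bit]  # store one logical bit redundantly

    def add_noise(codeword, p=0.1):
        return [b ^ (random.random() < p) for b in codeword]  # flip each bit with probability p

    def decode(codeword):
        return int(sum(codeword) >= 2)  # majority vote corrects any single flip

    trials = 10_000
    failures = sum(decode(add_noise(encode(1))) != 1 for _ in range(trials))
    print(f"logical error rate: {failures / trials:.4f}")  # ~0.028, well below the raw 10%

The surface code pursues the same end, trading more physical qubits per logical qubit for a logical error rate that drops rapidly once the hardware error rate falls below a threshold.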

Since this approach to quantum computing involves integrated circuit fabrication technology, it is scalable, with thousands and perhaps millions of qubits within reach in the next decade. Presumably, this is why Google tapped Martinis and his group. A quantum state is relatively delicate and easily disturbed by interactions with the outside world. The group’s progress with respect to quantum error correction has, perhaps for the first time, made quantum computing mostly just a matter of cost and scaling.

Google’s participation in QuAIL and its experience with the D-Wave Two were just the beginning of what seems to be a commitment to developing a quantum computer. Martinis even mentioned that some of the innovations developed by D-Wave may be used to help in the construction of his group’s quantum computer. As the world becomes increasingly data driven, there is a clear gap between what is needed and what can be done with classical computing. Google understands this and appears focused on closing that gap with a quantum leap.