Thought your laptop was fast? Well, try to imagine a computer that is 100 million times faster. This leap into binary hyper-drive is now considered a possibility with the construction of a huge quantum computer, which could be just a few years away.

Such a development could potentially make Deep Thought, the supercomputer in The Hitchhiker’s Guide to the Galaxy, redundant. It may even finally prove that the answer to life, the universe, and everything is 42, but it certainly wouldn’t need 7.5 million years to do it, as Deep Thought did.

For a quantum computer to work, it has to have circuits that can operate in on and off states simultaneously. This is based on the laws of quantum mechanics, which allow very small particles to exist in a number of “superposition” states until they are observed or disturbed.

While a normal computer has bits that are either zeros or ones, a quantum computer has quantum bits (qubits), which can take on the value of zero, one, or both at the same time.

The qubit is the unit of quantum information. In a conventional computer, a bit has to be in one of two states: 0 or 1. Quantum mechanics, however, allows a qubit to be in a superposition, meaning it can be in both states at the same time. Because each qubit adopts a superposition of both 0 and 1, the number of states a quantum computer can represent simultaneously is 2^n, where n is the number of qubits employed. A quantum computer consisting of 500 qubits could therefore work with 2^500 states in a single operation, which is a gigantic figure.
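To make that 2^n growth concrete, here is a minimal, purely illustrative Python sketch (the function names are this example's own, not from any quantum library). It represents an n-qubit register the way a classical simulator would: as a list of 2^n amplitudes, with an equal superposition like the one described above giving every basis state the same weight.

```python
import math

def num_states(n_qubits):
    """Number of basis states an n-qubit register can hold in superposition."""
    return 2 ** n_qubits

def equal_superposition(n_qubits):
    """State vector with every basis state equally weighted, so the squared
    amplitudes (the measurement probabilities) sum to 1."""
    dim = num_states(n_qubits)
    amplitude = 1 / math.sqrt(dim)
    return [amplitude] * dim

# The state space doubles with every qubit added:
for n in (1, 2, 10, 500):
    print(f"{n} qubits -> {num_states(n)} simultaneous basis states")

state = equal_superposition(2)
print(state)                        # four equal amplitudes
print(sum(a * a for a in state))    # probabilities sum to 1
```

Even storing the amplitudes for 500 qubits classically would need 2^500 numbers, which is why simulating large quantum machines on conventional hardware is infeasible, and why the qubit counts quoted later in this article matter.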

A team at the University of Sussex has created a practical blueprint for constructing a giant quantum computer, and a proof-of-concept prototype is planned within two years.

The new concept from the team at the University of Sussex introduces electric field connections that allow charged atoms, or ions, to be transported from one module to another.

However, some examples of quantum computing already exist. Only recently, Google and NASA found that a D-Wave 2X system with 1,097 qubits outperformed existing supercomputers by more than 3,600 times and personal computers by 100 million times.

In fact, in January this year D-Wave Systems announced the availability of the D-Wave 2000Q quantum computer, with 2,000 qubits. The new system continues the company’s ethos of doubling the number of qubits on its quantum processing units (QPUs) every two years. Using benchmark problems from real-world applications, the D-Wave 2000Q system outperformed highly specialised algorithms run on state-of-the-art classical servers by factors of 1,000 to 10,000 times.

For these tests, D-Wave developed efficient CPU- and GPU-based implementations of highly specialised algorithms that are recognised as the strongest competition to D-Wave QPUs.

These were run on the latest generation of conventional servers. The benchmark problems, in sampling and optimisation, were created to represent the structure of common real-world problems while maximising the size of problem that could fit on the 2,000-qubit QPU. The benchmark comparisons were made against single CPU cores and 2,500-core GPUs at the largest problem size.

The result was that the D-Wave 2000Q system outperformed the GPU-based implementations by 100 times in equivalent problem-solving performance per watt. This is important because power efficiency is a critical parameter in large-scale computing. The power draw of D-Wave’s systems has remained constant across successive generations and is expected to continue to do so as computational power increases. As a result, computational power per watt is expected to rise rapidly.

### By Paul Whytock

Paul Whytock is European Editor for Electropages. He has reported extensively on the electronics industry in Europe, the United States and the Far East for over twenty years. Prior to entering journalism he worked as a design engineer with Ford Motor Company at locations in England, Germany, Holland and Belgium.