What makes Quantum Computing disruptive?

This blogpost examines the differences between classical and quantum computing, and the disruptive potential of quantum computers across industries.

October 23, 2017 • by Vivek Sharma

Consider the public announcements about quantum computing in just the last four weeks: a 17-qubit test chip developed by Intel; the largest-ever molecular simulation on a quantum computer, by IBM; and a $10 billion investment in quantum computing by the Chinese government. As per The Wall Street Journal, “A reliable, large-scale quantum computer could transform industries from AI to chemistry, accelerating machine learning and engineering new materials, chemicals and drugs .... If this works, it will change the world and how things are done!” We already have blazingly fast chips and supercomputers commercially available, so why is smart money betting on quantum computing? This blogpost explains the what and why of quantum computing, and its disruptive potential.

Let’s start with classical computing. Classical computing is built on the transistor, which, in very simple terms, is a switch with an output of binary 1 when switched on and binary 0 when switched off. Computers process information as sequences of 0s and 1s across these transistors. Net computing power is a function of the number of transistors in the circuit and has followed the famous Moore’s law: computing power doubles roughly every 18 months. The Intel 4004 processor in 1971 had only 2,300 transistors; that grew to 5.5 million transistors on the Pentium Pro in 1995, and to an astonishing 7.2 billion transistors on the 22-core Xeon Broadwell-E5 processor in 2016.

But transistor miniaturization is now approaching a physical limit: the smallest transistors on commercial chips measure about 10 nanometers, only a few dozen atoms across. At that scale, electrons risk leaking across the transistor’s barrier through a process called quantum tunneling, making the switch unreliable. Quantum computing sidesteps this challenge by not using transistors at all.

Instead, quantum computers use the spin orientation of elementary particles (e.g., electrons) as the basis of computing: spin in one direction is assigned the value 1, and spin in the other direction the value 0. Counterintuitively, in the quantum world an elementary particle can be in both states at the same time, a phenomenon called superposition. This lets us encode information in qubits (quantum bits), which can represent 0, 1, or any combination of the two. For a 2-bit problem, a classical computer must step through the four possible states 00, 01, 10, 11 one at a time. A 2-qubit quantum computer, in contrast, holds information about all four states simultaneously before measurement, and needs to conduct only one step.
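The idea above can be sketched in ordinary code. The following minimal simulation (an illustrative sketch, not how real quantum hardware works) represents a 2-qubit state as four complex amplitudes, one per basis state |00⟩, |01⟩, |10⟩, |11⟩, and applies a Hadamard gate to each qubit to put the system into an equal superposition of all four states:

```python
import math

def apply_hadamard(state, qubit):
    """Apply a Hadamard gate to one qubit of a multi-qubit state vector.

    H maps |0> -> (|0>+|1>)/sqrt(2) and |1> -> (|0>-|1>)/sqrt(2).
    """
    h = 1 / math.sqrt(2)
    new_state = list(state)
    for i0 in range(len(state)):
        if (i0 >> qubit) & 1:
            continue  # handle each (|...0...>, |...1...>) pair once, from the 0 side
        i1 = i0 | (1 << qubit)  # same basis index with the target qubit flipped to 1
        a0, a1 = state[i0], state[i1]
        new_state[i0] = h * (a0 + a1)
        new_state[i1] = h * (a0 - a1)
    return new_state

state = [1, 0, 0, 0]              # start in |00>: amplitude 1 for 00, 0 elsewhere
state = apply_hadamard(state, 0)  # superpose qubit 0
state = apply_hadamard(state, 1)  # superpose qubit 1

# Measurement probabilities are squared amplitude magnitudes;
# each of |00>, |01>, |10>, |11> now has probability 0.25.
probs = [abs(a) ** 2 for a in state]
```

Note that the single list `state` carries information about all four outcomes at once; a quantum gate acts on every amplitude in one operation, which is the intuition behind the “one step instead of four” claim.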

Extending the above logic, for an n-bit problem a classical computer may need up to 2^n computations for every single computation on an n-qubit quantum computer. A 30-qubit quantum computer would have computing power roughly equivalent to a 10-teraflop classical computer (10 trillion floating-point operations per second). This has enormous commercial and scientific applications: for searching large unsorted datasets, quantum algorithms such as Grover’s search take roughly the square root of the time required classically. Quantum computing could also enable simulation of complex molecules for drug development, real-time rebalancing of investment portfolios, and optimization of logistics networks even at peak demand.

With that much opportunity, why aren’t quantum computers already commercially available? One big reason is the extreme conditions under which elementary particles can be manipulated for quantum computing. The particles used for computation must be isolated, because any entanglement with surrounding particles leaks information out of the computation (a process known as decoherence). This isolation requires reducing the particles’ energy levels by cooling them to about 20 millikelvin (roughly -459 degrees Fahrenheit). The elaborate infrastructure needed to achieve this is currently available only in universities and the research arms of leading technology firms.

That challenge feels like déjà vu to anyone familiar with the history of computing. Computers in the 1970s were nearly the size of a classroom, while exponentially faster computers today fit in the palm of our hands. As investment pours into quantum computing, it should follow the same path of miniaturization and commercialization. The key milestone on that journey is a roughly 50-qubit computer, the threshold often called ‘quantum supremacy’, beyond which classical computers will not be able to match quantum computers. Ironically, the most immediate downside of quantum supremacy will be to online security. Most internet encryption and authentication relies on public-key cryptography, which is essentially a mathematics problem too complicated for a classical computer to solve in any practical amount of time. But a hacker with a sufficiently powerful quantum computer could solve it relatively quickly!
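The asymmetry that public-key cryptography depends on can be shown with toy numbers (an illustrative sketch using small primes; real keys are vastly larger). Encrypting with the public modulus is a fast modular exponentiation, while attacking the key by recovering the secret factors is brute-force work that grows rapidly with key size; a quantum computer running Shor’s algorithm would erase that asymmetry:

```python
def trial_division(n):
    """Return the smallest prime factor of n by brute-force trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

# Toy "private" primes (real RSA primes are hundreds of digits long)
p, q = 104729, 1299709
n = p * q                      # public modulus, safe to publish

# Legitimate use: modular exponentiation is fast even for huge moduli
ciphertext = pow(7, 65537, n)

# Attack: recovering p from n already takes ~100,000 divisions here,
# and the work grows exponentially with the number of digits in n
recovered = trial_division(n)
```

The point is not that trial division is the best classical attack (it isn’t), but that every known classical approach becomes impractical as keys grow, whereas a large fault-tolerant quantum computer would not be so limited.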