A quantum computer may have solved a problem in seconds that would take the fastest conventional supercomputer more than 10,000 years. A draft of a paper by Google researchers laying out the achievement leaked in recent days, triggering an avalanche of news coverage and speculation.
While the research has not yet been peer-reviewed (the final version of the paper is expected to appear soon), if it all checks out it would mark the first computation that can only be performed on a quantum processor.
That sounds remarkable, but what does it mean?
Quantum computing: the basics
To understand why quantum computers are a big deal, we need to go back to conventional, or digital, computers.
A computer is a machine that takes an input, applies a sequence of instructions, and produces an output. In a digital computer, these inputs, instructions, and outputs are all sequences of 1s and 0s (individually called bits).
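To make this concrete, here is a minimal sketch (my own illustration, not from the paper) of a digital computation where the input, the result, and everything in between are just strings of bits:

```python
# A digital computation: input bits -> instructions -> output bits.
# Here the "instruction" is binary addition of two 4-bit numbers.

def to_bits(n, width=4):
    """Render an integer as a fixed-width string of 1s and 0s."""
    return format(n, f"0{width}b")

a, b = 0b0101, 0b0011           # inputs: the bit strings 0101 (5) and 0011 (3)
result = a + b                  # the instruction: add them
print(to_bits(a), "+", to_bits(b), "=", to_bits(result))  # 0101 + 0011 = 1000
```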
A quantum computer does a similar thing, but it uses quantum bits, or qubits. Where a bit takes only one of two values (1 or 0), a qubit is described by the complex mathematics of quantum mechanics, giving a richer set of possibilities.
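One way to see that "richer set of possibilities" is to note that a qubit's state is a pair of complex amplitudes rather than a single 0 or 1. This sketch (my own illustration, under the standard textbook description, not code from the Google paper) shows a qubit that is an equal superposition of both values:

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0, and |b|^2 of measuring 1.

bit_zero = (1 + 0j, 0 + 0j)   # like a classical bit: definitely 0
bit_one = (0 + 0j, 1 + 0j)    # like a classical bit: definitely 1

# Unlike a bit, a qubit can also sit in a superposition of both values:
s = 1 / math.sqrt(2)
superposition = (s + 0j, s + 0j)

def measurement_probabilities(state):
    """Probability of reading 0 or 1 when the qubit is measured."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(measurement_probabilities(bit_zero))       # (1.0, 0.0)
print(measurement_probabilities(superposition))  # roughly (0.5, 0.5)
```

The continuous, complex-valued amplitudes are what quantum algorithms manipulate before a final measurement collapses the state back to ordinary bits.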
Building quantum computers takes phenomenal engineering. They must be isolated so that nothing disturbs the delicate quantum states of the qubits. This is why they are kept in vacuum chambers containing fewer particles than outer space, or in refrigerators colder than anything in the universe.
But at the same time, you need some way to interact with the qubits in order to perform instructions on them. The difficulty of this balancing act means the size of quantum computers has grown slowly.
However, as the number of qubits linked together in a quantum computer grows, it becomes exponentially harder to simulate its behavior with a digital computer.
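A rough back-of-the-envelope sketch (my own illustration, with assumed byte counts) shows why: a digital simulation must track one complex amplitude for every possible combination of qubit values, and the number of combinations doubles with each qubit added.

```python
# A classical simulation of n qubits must store 2**n complex amplitudes.
# Assuming 16 bytes per double-precision complex number, memory demand
# doubles with every additional qubit.

BYTES_PER_AMPLITUDE = 16

def simulation_memory_bytes(num_qubits):
    """Bytes needed to hold the full state vector of num_qubits qubits."""
    return (2 ** num_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 20, 30, 40, 50):
    print(f"{n} qubits -> {simulation_memory_bytes(n):,} bytes")
```

At 30 qubits the state vector already needs about 16 GiB; by 50 qubits it needs roughly 16 PiB, far beyond any single machine, which is why supremacy-scale circuits are so painful to simulate classically.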