Quantum Supremacy – Facts, Information

Quantum supremacy is the ability of quantum computing devices to solve problems that classical computers practically cannot solve; quantum advantage is the ability to solve such problems faster. From the point of view of computational complexity theory, this usually means providing a superpolynomial speedup over the best known or best possible classical algorithm. The term was popularized by John Preskill, but the concept of a quantum computational advantage, especially for simulating quantum systems, goes back to the quantum computing proposals of Yuri Manin and Richard Feynman.

Shor’s algorithm for factoring integers, which runs in polynomial time on a quantum computer, provides such a superpolynomial speedup over the best known classical algorithm. Although this speedup has not been proven unconditionally, factoring is believed to be hard with classical resources. The difficulty of proving what cannot be done with classical computation is a common obstacle to demonstrating quantum supremacy unconditionally. It also affects the boson sampling proposal of Aaronson and Arkhipov, D-Wave’s specialized frustrated cluster loop problems, and sampling the output of random quantum circuits.

Like integer factorization, the problem of sampling the output distributions of random quantum circuits is believed to be hard for classical computers under plausible complexity assumptions.

History

Google had previously announced plans to demonstrate quantum supremacy by the end of 2017 using an array of 49 superconducting qubits. However, as of early January 2018, only Intel had announced such hardware.

In October 2017, IBM demonstrated a simulation of 56 qubits on a conventional supercomputer, raising the number of qubits needed to demonstrate quantum supremacy.

In November 2018, Google announced a partnership with NASA under which NASA will “analyze the results from quantum circuits running on Google’s quantum processors and … provide comparisons with classical simulations to support Google in validating its hardware and establishing a baseline for quantum supremacy.”

Theoretical work published in 2018 suggests that quantum supremacy should be achievable on a “two-dimensional lattice of 7 × 7 qubits and about 40 clock cycles”, provided the error rate is low enough.

On June 21, 2019, Scientific American suggested that, according to Neven’s law, quantum supremacy would be achieved in 2019.

On September 20, 2019, the Financial Times reported that “Google claims to have achieved quantum supremacy on an array of 54 qubits, of which 53 were functional and were used to perform a computation in 200 seconds that would have taken a typical supercomputer about 10,000 years.” The report originally appeared on a NASA website but was removed shortly after publication. Later, on October 23, the achievement of quantum supremacy was officially announced. However, experts from IBM, after analyzing the published data, pointed out that some steps were not optimal and that such a computation on a classical supercomputer would in fact take only 2.5 days instead of the claimed 10,000 years.

On December 3, 2020, Chinese scientists reported that their quantum computer Jiuzhang, which uses entangled photons, had achieved quantum supremacy. In 200 seconds it completed a computation that would take the world’s fastest classical supercomputer more than half a billion years.

Computational complexity

Computational complexity concerns how the amount of resources required to solve a problem scales with the size of the input. As an extension of classical computational complexity theory, quantum complexity theory describes the operation of a universal quantum computer without considering the cost of building it or of suppressing the effects of decoherence and noise. Since quantum information is a generalization of classical information, a quantum computer can simulate any classical algorithm.

The complexity class of bounded-error quantum polynomial time (BQP) is the class of decision problems that can be solved in polynomial time by a universal quantum computer. BQP is related to the classical complexity classes by a chain of inclusions, shown below; whether any of these inclusions is strict remains an open question.
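For reference, the standard chain of inclusions relating BQP to the classical classes can be written as follows; this is textbook complexity-theory background rather than a claim made explicitly in the text above:

```latex
\mathrm{P} \subseteq \mathrm{BPP} \subseteq \mathrm{BQP} \subseteq \mathrm{PP} \subseteq \mathrm{PSPACE}
```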

Unlike decision problems, which require a yes or no answer, sampling problems ask for samples from a probability distribution. If there were a classical algorithm that could efficiently sample the output of an arbitrary quantum circuit, the polynomial hierarchy would collapse to the third level, which is considered very unlikely. BosonSampling is a more specific proposal whose classical hardness rests on the intractability of computing the permanent of a large matrix with complex entries, a #P-hard problem. The arguments used to reach this conclusion have also been applied to IQP sampling, where only the conjecture that the average-case and worst-case complexities of the problem coincide is needed.
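To make the hardness claim concrete, here is a minimal Python sketch of Ryser's formula for the permanent (the function name and the self-check are illustrative, not from the text). Even this compact exact method sums over all 2^n column subsets, which is a direct view of why permanents of large matrices are out of reach classically:

```python
import itertools

import numpy as np


def permanent_ryser(a):
    """Permanent of an n x n matrix with complex entries, via Ryser's formula.

    The outer sum runs over all 2^n - 1 non-empty column subsets, so the
    running time grows like 2^n even for this best-known-style exact method.
    """
    n = a.shape[0]
    total = 0.0 + 0.0j
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            row_sums = a[:, list(cols)].sum(axis=1)   # sum of the chosen columns in each row
            total += (-1) ** r * np.prod(row_sums)
    return (-1) ** n * total


# Self-check against the brute-force definition on a random 4 x 4 complex matrix.
rng = np.random.default_rng(0)
m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
brute = sum(np.prod([m[i, p[i]] for i in range(4)])
            for p in itertools.permutations(range(4)))
print(permanent_ryser(m), brute)                      # the two values agree up to rounding
```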

Superpolynomial speedup

The following algorithms provide a superpolynomial speedup over the best known classical algorithms:

Shor’s Algorithm for Factoring Integers

This algorithm finds the prime factorization of an n-bit integer in roughly O(n³) time, whereas the best known classical algorithm, the general number field sieve, requires time roughly exponential in n^(1/3), and the best rigorously proven classical upper bound for this problem is about O(2^(n/4)). Shor’s algorithm can also speed up any task that reduces to integer factorization, including the membership problem for matrix groups over fields of odd order.

For quantum computing, this algorithm is important both practically and historically. It was the first polynomial-time quantum algorithm proposed for a problem believed to be hard for classical computers. Confidence in this hardness is so strong that the security of RSA, the most widely used encryption protocol today, rests on the difficulty of factoring. A quantum computer that could run this algorithm successfully and reproducibly would break this encryption scheme. This risk is called the Y2Q problem, by analogy with the Y2K problem of the year 2000.
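As an illustration of the classical skeleton of Shor's algorithm, here is a hedged Python sketch (function names and the tiny example are illustrative): the order-finding step, done below by brute force and therefore exponentially slowly, is exactly the part a quantum computer performs in polynomial time, while the surrounding number theory is purely classical.

```python
import math
import random


def multiplicative_order(a, n):
    """Smallest r > 0 with a**r % n == 1, found by brute force.

    This is the step Shor's algorithm speeds up; the loop below is
    exponential in the bit length of n and only works for tiny examples.
    """
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r


def factor_via_order_finding(n, attempts=20):
    """Classical reduction from factoring n to order finding (the shell of Shor's algorithm)."""
    for _ in range(attempts):
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                       # the random base already shares a factor with n
        r = multiplicative_order(a, n)
        if r % 2 == 0:
            y = pow(a, r // 2, n)
            if y != n - 1:                 # i.e. a**(r/2) is not -1 mod n
                f = math.gcd(y - 1, n)
                if 1 < f < n:
                    return f
    return None


print(factor_via_order_finding(15))        # prints 3 or 5
```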

Boson Sampling

This computational paradigm is based on sending identical photons through a linear optical network; it can solve certain sampling and search problems that, under several plausible conjectures, are intractable for classical computers. Nevertheless, it has been shown that boson sampling in a system with sufficiently large losses and noise can be simulated efficiently.

The largest experimental implementation of boson sampling to date has 6 modes and can therefore process up to 6 photons at a time. The best known classical algorithm for simulating boson sampling runs in time of order O(n·2^n + mn²) for a system with n photons and m output modes, and BosonSampling is an open-source implementation of this algorithm in R. The algorithm leads to an estimate of about 50 photons needed to demonstrate quantum supremacy with boson sampling.
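To show how boson-sampling output probabilities reduce to matrix permanents, here is a minimal Python sketch for collision-free input and output patterns; the interferometer, the mode choices, and the helper names are illustrative assumptions, not taken from the text.

```python
import itertools

import numpy as np


def permanent(a):
    """Brute-force permanent; adequate for the tiny matrices used here."""
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))


def random_unitary(m, seed=0):
    """A random m x m unitary from the QR decomposition of a complex Gaussian matrix."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
    q, _ = np.linalg.qr(z)
    return q


# Three photons injected into modes 0, 1 and 2 of a 6-mode interferometer.
m, in_modes = 6, (0, 1, 2)
u = random_unitary(m)

# For collision-free patterns, the probability of detecting one photon in each
# chosen output mode is the squared modulus of the permanent of a submatrix of u.
out_modes = (3, 4, 5)
sub = u[np.ix_(out_modes, in_modes)]
print(abs(permanent(sub)) ** 2)
```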

Sampling the output distribution of random quantum circuits

The best known algorithm for simulating an arbitrary random quantum circuit requires time that scales exponentially with the number of qubits; on this basis, one research group estimated that about 50 qubits could be enough to demonstrate quantum supremacy. Google announced its intention to demonstrate quantum supremacy by the end of 2017 by building and running a 49-qubit chip that could sample, within a reasonable time, distributions inaccessible to any modern classical computer. However, the largest simulation of quantum circuits successfully performed on a supercomputer reached 56 qubits, so the number of qubits required to demonstrate quantum supremacy may need to be increased.
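A minimal sketch of why the classical simulation cost explodes with qubit count, assuming a brute-force state-vector representation (the gate choice and qubit count below are illustrative): storing all amplitudes of n qubits requires 2^n complex numbers, so 20 qubits fit in a few megabytes while roughly 50 qubits would already need petabytes of memory.

```python
import numpy as np


def apply_single_qubit_gate(state, gate, q, n):
    """Apply a 2 x 2 gate to qubit q of an n-qubit state vector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [q]))   # contract the gate with axis q
    psi = np.moveaxis(psi, 0, q)                     # move the new axis back to position q
    return psi.reshape(-1)


n = 20                           # 2**20 amplitudes * 16 bytes ~ 16 MB; ~50 qubits would need petabytes
state = np.zeros(2 ** n, dtype=np.complex128)
state[0] = 1.0                   # start in |00...0>

hadamard = np.array([[1, 1], [1, -1]], dtype=np.complex128) / np.sqrt(2)
for q in range(n):
    state = apply_single_qubit_gate(state, hadamard, q, n)

# "Sampling the output distribution" means drawing bit strings with probability |amplitude|^2.
probs = np.abs(state) ** 2
probs /= probs.sum()             # guard against floating-point drift
rng = np.random.default_rng(1)
sample = rng.choice(2 ** n, p=probs)
print(format(sample, f"0{n}b"))
```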

Criticism

Quantum computers are much more error-prone than classical computers because of decoherence and noise. The threshold theorem states that a noisy quantum computer can use quantum error-correcting codes to simulate a noiseless quantum computer, provided the error introduced in each computation cycle is below a certain threshold. Numerical simulations suggest this threshold may be as high as 3%.
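As a toy illustration of the threshold idea (not the actual codes or threshold values discussed above), the following Python snippet compares the failure probability of a three-copy repetition code with majority voting against the bare physical error rate: below a threshold, encoding helps, and the benefit grows as the physical error rate shrinks.

```python
def logical_error_rep3(p):
    """Failure probability of majority voting over three independent noisy copies."""
    return 3 * p ** 2 * (1 - p) + p ** 3


for p in (0.3, 0.1, 0.03, 0.01):
    print(f"physical error {p:>5}: encoded error {logical_error_rep3(p):.6f}")

# Below the toy threshold of 0.5 the encoded error rate is smaller than the bare
# physical rate, and the gap widens rapidly as p decreases -- the qualitative
# content of the threshold theorem (real fault-tolerance thresholds are far lower).
```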

However, it is not known how the resources needed for error correction will scale with the number of qubits. Skeptics point to the unknown behavior of noise in scaled-up quantum systems as a potential obstacle to successfully implementing quantum computing and demonstrating quantum supremacy.