
Quantum Computing (PART-2)

HAVE A LOOK AT THE FIRST PART (Quantum Computing (PART-1) (wixsite.com)) TO GET A CLEAR VIEW. LET'S CONTINUE WITH PART 2.



Obstacles:

There are several technical challenges in building a large-scale quantum computer. Physicist David DiVincenzo has listed these requirements for a practical quantum computer:

  • Physically scalable to increase the number of qubits

  • Qubits that can be initialized to a simple fiducial state (such as |00…0⟩)

  • Quantum gates that are faster than the decoherence time

  • Universal gate set

  • Qubits that can be read easily

Sourcing parts for quantum computers is also very difficult. Many quantum computers, like those constructed by Google and IBM, need Helium-3, a nuclear research byproduct, and special superconducting cables made only by the Japanese company Coax Co.


The control of multi-qubit systems requires the generation and coordination of a large number of electrical signals with tight and deterministic timing resolution. This has led to the development of quantum controllers which enable interfacing with the qubits. Scaling these systems to support a growing number of qubits is an additional challenge.



Quantum decoherence:

One of the greatest challenges involved with constructing quantum computers is controlling or removing quantum decoherence. This usually means isolating the system from its environment, as interactions with the external world cause the system to decohere. However, other sources of decoherence also exist, including the quantum gates themselves, lattice vibrations, and the background bath of nuclear spins in the physical system used to implement the qubits. Decoherence is irreversible, as it is effectively non-unitary, and is usually something that should be highly controlled, if not avoided. Decoherence times for candidate systems, in particular the transverse relaxation time T2 (in NMR and MRI terminology, also called the dephasing time), typically range between nanoseconds and seconds at low temperature. Currently, some quantum computers require their qubits to be cooled to 20 millikelvin to prevent significant decoherence. A 2020 study argues that ionizing radiation such as cosmic rays can nevertheless cause certain systems to decohere within milliseconds.


As a result, time-consuming tasks may render some quantum algorithms inoperable: the superpositions cannot be maintained for a long enough duration and are eventually corrupted.


These issues are more difficult for optical approaches, as the timescales are orders of magnitude shorter; an often-cited approach to overcoming them is optical pulse shaping. Error rates are typically proportional to the ratio of operating time to decoherence time, hence any operation must be completed much more quickly than the decoherence time.



As described in the quantum threshold theorem, if the error rate is small enough, it is thought to be possible to use quantum error correction to suppress errors and decoherence. This allows the total calculation time to be longer than the decoherence time if the error correction scheme can correct errors faster than decoherence introduces them. An often-cited figure for the required error rate in each gate for fault-tolerant computation is 10⁻³, assuming the noise is depolarizing.
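
To make the timescale argument concrete, here is a minimal back-of-the-envelope sketch in Python. The gate time and dephasing time are assumed example values, not figures from any particular machine; the point is only that the per-gate error scales roughly like the ratio of operating time to decoherence time and can be compared with the often-cited 10⁻³ threshold.

    # Rough per-gate error estimate from the operating-time / decoherence-time
    # ratio, compared with the often-cited 10^-3 fault-tolerance threshold.
    # The numbers below are assumed example values, chosen only for illustration.
    T2 = 100e-6        # assumed dephasing (decoherence) time: 100 microseconds
    t_gate = 50e-9     # assumed gate duration: 50 nanoseconds
    threshold = 1e-3   # often-cited per-gate error target for fault tolerance

    error_per_gate = t_gate / T2   # error roughly proportional to this ratio
    print(f"estimated error per gate ~ {error_per_gate:.0e}")               # ~5e-04
    print("below threshold" if error_per_gate < threshold else "above threshold")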

Meeting this scalability condition is possible for a wide range of systems. However, the use of error correction brings with it the cost of a greatly increased number of required qubits. The number required to factor integers using Shor's algorithm is still polynomial, and thought to be between L and L², where L is the number of digits in the number to be factored; error correction algorithms would inflate this figure by an additional factor of L. For a 1000-bit number, this implies a need for about 10⁴ qubits without error correction. With error correction, the figure would rise to about 10⁷ qubits. Computation time is about L², or about 10⁷ steps, which at 1 MHz would take about 10 seconds.
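
As a sanity check on those figures, the arithmetic for a 1000-bit number can be reproduced in a few lines of Python. The constant factors below are assumptions chosen only to match the order-of-magnitude estimates quoted above.

    # Order-of-magnitude resource estimate for factoring a 1000-bit number.
    # The constant factors (10x) are illustrative assumptions, not derived values.
    L = 1000                            # bits in the number to be factored
    qubits_no_ec = 10 * L               # ~10^4 qubits without error correction
    qubits_with_ec = qubits_no_ec * L   # extra factor of ~L for error correction -> ~10^7
    steps = 10 * L**2                   # ~10^7 elementary steps (of order L^2)
    clock_hz = 1e6                      # assumed 1 MHz operation rate
    print(qubits_no_ec, qubits_with_ec, steps / clock_hz)   # 10000  10000000  10.0 seconds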

A very different approach to the stability-decoherence problem is to create a topological quantum computer with anyons, quasi-particles used as threads and relying on braid theory to form stable logic gates.


Physicist Mikhail Dyakonov has expressed scepticism of quantum computing as follows:

"So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be... about 10300... Could we ever learn to control the more than 10300 continuously variable parameters defining the quantum state of such a system? My answer is simple. No, never."


Developments



Quantum computing models:

There are several quantum computing models, distinguished by the basic elements in which the computation is decomposed. The four main models of practical importance are:

  • Quantum gate array (computation decomposed into a sequence of few-qubit quantum gates)

  • One-way quantum computer (computation decomposed into a sequence of one-qubit measurements applied to a highly entangled initial state or cluster state)

  • Adiabatic quantum computer, based on quantum annealing (computation decomposed into a slow continuous transformation of an initial Hamiltonian into a final Hamiltonian, whose ground states contain the solution)

  • Topological quantum computer (computation decomposed into the braiding of anyons in a 2D lattice)

The quantum Turing machine is theoretically important but the physical implementation of this model is not feasible. All four models of computation are equivalent; each can simulate the other with no more than polynomial overhead.
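
To make the gate-array model concrete, here is a minimal state-vector sketch in plain NumPy (no quantum hardware or vendor SDK assumed): a Hadamard gate followed by a CNOT turns |00⟩ into the entangled Bell state (|00⟩ + |11⟩)/√2, the basic building block of such circuits.

    import numpy as np

    # Two-qubit gate-array circuit, simulated with a state vector:
    # apply H to qubit 0, then CNOT (control 0, target 1).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)   # start in |00>
    state = np.kron(H, I) @ state                   # Hadamard on the first qubit
    state = CNOT @ state                            # entangle the two qubits

    print(np.round(state.real, 3))   # [0.707 0. 0. 0.707] -> (|00> + |11>)/sqrt(2)
    print(np.abs(state) ** 2)        # measurement probabilities: 50% |00>, 50% |11>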



Physical realizations

For physically implementing a quantum computer, many different candidates are being pursued, among them (distinguished by the physical system used to realize the qubits):

  • Superconducting quantum computing (qubit implemented by the state of small superconducting circuits [Josephson junctions])

  • Trapped ion quantum computer (qubit implemented by the internal state of trapped ions)

  • Neutral atoms in optical lattices (qubit implemented by internal states of neutral atoms trapped in an optical lattice)

  • Quantum dot computer, spin-based (e.g. the Loss-DiVincenzo quantum computer) (qubit given by the spin states of trapped electrons)

  • Quantum dot computer, spatial-based (qubit given by electron position in double quantum dot)

  • Quantum computing using engineered quantum wells, which could in principle enable the construction of quantum computers that operate at room temperature

  • Coupled quantum wire (qubit implemented by a pair of quantum wires coupled by a quantum point contact)

  • Nuclear magnetic resonance quantum computer (NMRQC) implemented with the nuclear magnetic resonance of molecules in solution, where qubits are provided by nuclear spins within the dissolved molecule and probed with radio waves

  • Solid-state NMR Kane quantum computers (qubit realized by the nuclear spin state of phosphorus donors in silicon)

  • Electrons-on-helium quantum computers (qubit is the electron spin)

  • Cavity quantum electrodynamics (CQED) (qubit provided by the internal state of trapped atoms coupled to high-finesse cavities)

  • Molecular magnet (qubit given by spin states)

  • Fullerene-based ESR quantum computer (qubit based on the electronic spin of atoms or molecules encased in fullerenes)

  • Nonlinear optical quantum computer (qubits realized by processing states of different modes of light through both linear and nonlinear elements)

  • Linear optical quantum computer (qubits realized by processing states of different modes of light through linear elements e.g. mirrors, beam splitters and phase shifters)

  • Diamond-based quantum computer (qubit realized by the electronic or nuclear spin of nitrogen-vacancy centres in diamond)

  • Bose-Einstein condensate-based quantum computer

  • Transistor-based quantum computer – string quantum computers with entrainment of positive holes using an electrostatic trap

  • Rare-earth-metal-ion-doped inorganic crystal-based quantum computers (qubit realized by the internal electronic state of dopants in optical fibres)

  • Metallic-like carbon nanospheres-based quantum computers

The large number of candidates demonstrates that quantum computing, despite rapid progress, is still in its infancy.


Relation to computability and complexity theory



Computability theory:

Any computational problem solvable by a classical computer is also solvable by a quantum computer. Intuitively, this is because it is believed that all physical phenomena, including the operation of classical computers, can be described using quantum mechanics, which underlies the operation of quantum computers.

Conversely, any problem solvable by a quantum computer is also solvable by a classical computer; or more formally, any quantum computer can be simulated by a Turing machine. In other words, quantum computers provide no additional power over classical computers in terms of computability. This means that quantum computers cannot solve undecidable problems like the halting problem and the existence of quantum computers does not disprove the Church–Turing thesis.

Quantum computers do not yet satisfy the strong Church thesis: while prototype machines have been realized, a universal quantum computer has yet to be physically constructed, and the strong version of the thesis requires such a physical computer.


Quantum complexity theory:

While quantum computers cannot solve any problems that classical computers cannot already solve, it is suspected that they can solve certain problems faster than classical computers. For instance, it is known that quantum computers can efficiently factor integers, while this is not believed to be the case for classical computers.


The class of problems that can be efficiently solved by a quantum computer with bounded error is called BQP, for "bounded error, quantum, polynomial time". More formally, BQP is the class of problems that can be solved by a polynomial-time quantum Turing machine with an error probability of at most 1/3. As a class of probabilistic problems, BQP is the quantum counterpart to BPP ("bounded error, probabilistic, polynomial time"), the class of problems that can be solved by polynomial-time probabilistic Turing machines with bounded error. It is known that BPP ⊆ BQP and it is widely suspected that BPP ⊊ BQP, which intuitively would mean that quantum computers are more powerful than classical computers in terms of time complexity.
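
The 1/3 bound in this definition is not special: repeating the computation and taking a majority vote drives the error probability down exponentially, which is why any fixed error below 1/2 would define the same class. The sketch below simulates that amplification with a biased coin standing in for a bounded-error algorithm; it is an illustration of the principle only, not a quantum computation.

    import random

    # A "bounded-error algorithm" is modelled as a coin that is wrong with
    # probability 1/3; repeating it and taking a majority vote boosts accuracy.
    def noisy_run(p_error=1/3):
        return random.random() > p_error        # True = correct answer

    def majority_vote(k):
        return sum(noisy_run() for _ in range(k)) > k / 2

    trials = 10_000
    for k in (1, 11, 51):
        accuracy = sum(majority_vote(k) for _ in range(trials)) / trials
        print(f"{k:3d} repetitions -> correct with probability ~ {accuracy:.3f}")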



[Figure: the suspected relationship of BQP to several classical complexity classes.]

The exact relationship of BQP to P, NP, and PSPACE is not known. However, it is known that P ⊆ BQP ⊆ PSPACE; that is, all problems that can be efficiently solved by a deterministic classical computer can also be efficiently solved by a quantum computer, and all problems that can be efficiently solved by a quantum computer can also be solved by a deterministic classical computer with polynomial space resources. It is further suspected that BQP is a strict superset of P, meaning some problems are efficiently solvable by quantum computers that are not efficiently solvable by deterministic classical computers. For instance, integer factorization and the discrete logarithm problem are known to be in BQP and are suspected to be outside of P. On the relationship of BQP to NP, little is known beyond the fact that some NP problems that are believed not to be in P are also in BQP (integer factorization and the discrete logarithm problem are both in NP, for example). It is suspected that NP ⊈ BQP; that is, it is believed that there are efficiently checkable problems that are not efficiently solvable by a quantum computer. As a direct consequence of this belief, it is also suspected that BQP is disjoint from the class of NP-complete problems (if an NP-complete problem were in BQP, then it would follow from NP-hardness that all problems in NP are in BQP).


The relationship of BQP to the basic classical complexity classes can be summarized as follows:

P ⊆ BPP ⊆ BQP ⊆ PP ⊆ PSPACE


It is also known that BQP is contained in the complexity class #P (or more precisely in the associated class of decision problems P^#P), which is a subclass of PSPACE.

It has been speculated that further advances in physics could lead to even faster computers. For instance, it has been shown that a non-local hidden variable quantum computer based on Bohmian mechanics could implement a search of an N-item database in at most O(N^(1/3)) steps, a slight speedup over Grover's algorithm, which runs in O(√N) steps. Note, however, that neither search method would allow quantum computers to solve NP-complete problems in polynomial time. Theories of quantum gravity, such as M-theory and loop quantum gravity, may allow even faster computers to be built. However, defining computation in these theories is an open problem due to the problem of time; that is, within these physical theories there is currently no obvious way to describe what it means for an observer to submit input to a computer at one point in time and then receive output at a later point in time.
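
For a sense of the scale of that speedup, the step counts can be compared directly; the database size N below is an arbitrary example, and the constants hidden by the O(·) notation are ignored.

    # Unstructured search over N items: classical ~N, Grover ~sqrt(N),
    # hypothetical Bohmian-mechanics machine ~N^(1/3). N is an arbitrary example.
    N = 10 ** 12
    print(f"classical       ~ {N:.0e} steps")
    print(f"Grover          ~ {N ** 0.5:.0e} steps")     # ~1e+06
    print(f"cube-root model ~ {N ** (1/3):.0e} steps")   # ~1e+04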


