quantum computing

Quantum computing is an area of study focused on the development of computer-based technologies centered around the principles of quantum theory. Quantum theory explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses a combination of qubits (quantum bits) to perform specific computational tasks, all at a much higher efficiency than classical computers. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases. For example, quantum computing excels at tasks such as simulations.

The quantum computer gains much of its processing power through the ability of qubits to be in multiple states at one time. They can perform tasks using a combination of 1s, 0s, and both a 1 and 0 simultaneously. Current research centers in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory. In addition, developers have begun gaining access to quantum computers through cloud services.

Quantum computing began with finding its essential elements. In 1981, Paul Benioff at Argonne National Labs came up with the idea of a computer that operated with quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, and he published his breakthrough paper a few months later.

Photo: The IBM Q System One was introduced in January 2019 and was the first quantum computing system for scientific and commercial use.

Quantum Theory

Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy and matter exist in individual units. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

The Essential Elements of Quantum Theory:

  • Energy, like matter, consists of discrete units, as opposed to a continuous wave.
  • Elementary particles of energy and matter, depending on the conditions, may behave like particles or waves.
  • The movement of elementary particles is inherently random, and, thus, unpredictable.
  • The simultaneous measurement of two complementary values -- such as the position and momentum of a particle -- is inherently imprecise. The more precisely one value is measured, the less precisely the other can be known.
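
Formally, this last element is captured by the Heisenberg uncertainty relation, Δx · Δp ≥ ħ/2: the product of the uncertainty in a particle's position (Δx) and the uncertainty in its momentum (Δp) can never fall below half the reduced Planck constant.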

Further Developments of Quantum Theory

Niels Bohr proposed the Copenhagen interpretation of quantum theory. This interpretation asserts that a particle is whatever it is measured to be, but that it cannot be assumed to have specific properties, or even to exist, until it is measured. This relates to a principle called superposition, which claims that while we do not know what the state of a given object is, it is actually in all possible states simultaneously -- as long as we don't look to check.

To illustrate this theory, we can use the famous analogy of Schrödinger's Cat. First, we take a living cat and place it in a lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both alive and dead, according to quantum law -- in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

A Comparison of Classical and Quantum Computing

Classical computing relies on principles expressed by Boolean algebra, usually operating with a 3 or 7-mode logic gate principle. Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. In addition, there is still a limit as to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply.
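
To make the contrast concrete, here is a minimal illustrative sketch (in Python, not tied to any particular hardware) of what an exclusive binary state means: a classical 2-bit register holds exactly one of its four configurations at any instant, and Boolean gates map one definite state to another.

    # A classical 2-bit register is in exactly one of 2**2 = 4 possible
    # configurations (00, 01, 10, 11) at any instant.
    register = (1, 0)                # the register is definitely "10"

    def AND(a, b): return a & b      # Boolean gates map definite states
    def OR(a, b):  return a | b      # to definite states
    def NOT(a):    return 1 - a

    a, b = register
    print(f"state={a}{b} AND={AND(a, b)} OR={OR(a, b)} NOT(a)={NOT(a)}")
    # state=10 AND=0 OR=1 NOT(a)=0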

The quantum computer operates with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1). In a quantum computer, a number of elementary particles such as electrons or photons can be used. Each particle is given a charge, or polarization, acting as a representation of 0 and/or 1. Each particle is called a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.
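
As a rough illustration (a toy model, not any vendor's toolkit), a qubit's state can be represented by two complex amplitudes whose squared magnitudes give the probabilities of reading out a 0 or a 1:

    import numpy as np

    # A qubit's state is a pair of complex amplitudes (alpha, beta) with
    # |alpha|^2 + |beta|^2 = 1.  Measurement yields 0 with probability
    # |alpha|^2 and 1 with probability |beta|^2.
    qubit = np.array([1.0, 0.0], dtype=complex)    # the definite state 0

    prob_0 = abs(qubit[0]) ** 2
    prob_1 = abs(qubit[1]) ** 2
    print(prob_0, prob_1)      # 1.0 0.0 -- behaves like a classical bit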

Superposition

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used and the particle is isolated from all external influences, the particle enters a superposition of states, behaving as if it were in both states simultaneously.
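
Continuing the same toy model, the half pulse can be approximated by the Hadamard gate, the standard operation that rotates a definite 0 into an equal superposition; repeated simulated measurements then split roughly evenly between 0 and 1.

    import numpy as np

    rng = np.random.default_rng(0)

    # The Hadamard gate rotates the definite state |0> into an equal
    # superposition of |0> and |1>.
    H = np.array([[1, 1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    state = H @ np.array([1, 0], dtype=complex)  # amplitudes (0.707, 0.707)
    probs = np.abs(state) ** 2                   # [0.5, 0.5]

    # Each simulated measurement collapses the superposition to 0 or 1.
    outcomes = rng.choice([0, 1], size=1000, p=probs)
    print(np.bincount(outcomes))                 # roughly [500, 500]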

Each qubit utilized could take a superposition of both 0 and 1. This means the number of computations a quantum computer could take on is 2^n, where n is the number of qubits used. A quantum computer composed of 500 qubits would have the potential to do 2^500 calculations in a single step. For reference, 2^500 is far more than the number of atoms in the known universe. These particles all interact with each other via quantum entanglement.
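
The scale of that claim can be checked with ordinary arbitrary-precision integer arithmetic: 2^500 is a 151-digit number, far beyond the commonly cited estimate of roughly 10^80 atoms in the observable universe.

    # The state space of n qubits has 2**n amplitudes.
    n = 500
    states = 2 ** n
    print(len(str(states)))      # 151 -- a 151-digit number
    print(states > 10 ** 80)     # True: dwarfs the ~10^80 atoms estimate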

In contrast to classical computing, quantum computing counts as true parallel processing. Classical computers today still only truly do one thing at a time; what passes for parallel processing is really just two or more processors each doing one thing at a time.

Entanglement

Particles (such as qubits) that have interacted at some point retain a type of connection and can become entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle -- up or down -- reveals that the spin of its partner is in the opposite direction. In addition, due to superposition, the measured particle has no single spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction. Why this happens is not yet fully explained.

Quantum entanglement allows qubits that are separated by large distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
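
A small simulation makes the correlation concrete. In the same state-vector toy model as above, the entangled state (|01> + |10>)/√2 always yields opposite results for the two qubits, whichever outcome occurs.

    import numpy as np

    rng = np.random.default_rng(1)

    # Two-qubit state over the basis |00>, |01>, |10>, |11>.  The entangled
    # state (|01> + |10>)/sqrt(2) forces the two measurements to disagree.
    bell = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)
    probs = np.abs(bell) ** 2                    # [0, 0.5, 0.5, 0]

    labels = ["00", "01", "10", "11"]
    samples = rng.choice(4, size=10, p=probs)
    print([labels[i] for i in samples])
    # Only "01" and "10" ever appear: knowing one qubit fixes the other.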

Taken together, quantum superposition and entanglement create enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. As more qubits are added, the capacity expands exponentially.
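
In the toy model, this capacity claim corresponds to a tensor (Kronecker) product: putting each of two qubits into superposition yields a register with nonzero amplitude on all four configurations at once, and each additional qubit doubles the number of amplitudes.

    import numpy as np

    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # one qubit in
                                                           # superposition

    # A 2-qubit register is the Kronecker product of its qubits: all four
    # configurations 00, 01, 10, 11 carry amplitude at the same time.
    register = np.kron(plus, plus)
    print(register)                         # four equal amplitudes of 0.5

    # Each added qubit doubles the number of amplitudes (2**n growth).
    print(len(np.kron(register, plus)))     # 8 amplitudes for 3 qubits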

Quantum Programming

Quantum computing offers the ability to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations." This would permit extremely fast ways of solving certain mathematical problems, such as the factorization of large numbers.

The first quantum computing program appeared in 1994, when Peter Shor developed a quantum algorithm that could efficiently factor large numbers.
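
Shor's algorithm needs quantum hardware to be efficient, but the number theory it exploits can be sketched classically: find the period r of a^x mod N, and (when r is even) gcd(a^(r/2) ± 1, N) usually reveals a factor of N. The quantum circuit's only job is to find r exponentially faster than the brute-force search in this toy example.

    from math import gcd

    def find_period(a, N):
        # Brute-force the period r of a**x mod N -- the step Shor's quantum
        # circuit performs exponentially faster for large N.
        x, value = 1, a % N
        while value != 1:
            x += 1
            value = (value * a) % N
        return x

    def factor(N, a=2):
        if gcd(a, N) != 1:            # lucky case: a already shares a factor
            return gcd(a, N)
        r = find_period(a, N)
        if r % 2:                     # odd period: would need another a
            return None
        candidate = gcd(pow(a, r // 2) - 1, N)
        return candidate if candidate not in (1, N) else None

    print(factor(15))   # 3   (15 = 3 * 5)
    print(factor(21))   # 7   (21 = 3 * 7)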

The Problems - And Some Solutions

The benefits of quantum computing are promising, but there are still huge obstacles to overcome. Some problems with quantum computing are:

  • Interference - the slightest disturbance in a quantum system can cause a quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Some success has been achieved with the use of qubits in intense magnetic fields, such as trapped ions.
  • Error correction - Qubits are not digital bits of data and cannot use conventional error correction. Error correction is critical in quantum computing, where even a single error in a calculation can cause the validity of the entire computation to collapse. There has been considerable progress in this area, however, with an error correction algorithm developed that utilizes 9 qubits (1 computational and 8 correctional). More recently, IBM demonstrated an approach that makes do with a total of 5 qubits (1 computational and 4 correctional). The redundancy idea behind these schemes is illustrated in the sketch after this list.
  • Output observance - Retrieving output data after a quantum calculation is complete risks corrupting the data. Developments have been made, such as a database search algorithm that relies on the special "wave" shape of the probability curve in quantum computers. This ensures that once all calculations are done, the act of measurement will see the quantum state decohere into the correct answer.
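
Real quantum error correction is far subtler than classical coding (the protected qubit can never be read directly), but the redundancy idea behind the 9-qubit and 5-qubit schemes mentioned above can be illustrated with their classical ancestor, the repetition code: store each logical bit three times and let a majority vote repair any single flipped copy.

    import random

    random.seed(3)

    def encode(bit):
        # Repetition code: store one logical bit as three physical copies.
        return [bit, bit, bit]

    def correct(copies):
        # A majority vote repairs any single flipped copy.
        return 1 if sum(copies) >= 2 else 0

    logical = 1
    codeword = encode(logical)
    codeword[random.randrange(3)] ^= 1        # a noise event flips one copy

    print(codeword, "->", correct(codeword))  # e.g. [1, 0, 1] -> 1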

There are many problems to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, breakthroughs in the last 15 years and in the recent past have made some form of quantum computing practical. There is still much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute force searches, while civilian applications range from DNA modeling to complex material science analysis.

This was last updated in June 2020
