
Quantum computing: An overview

24/06/2021

The "quantum" in quantum computing refers to the quantum mechanics that the system uses to calculate outputs.

In physics, a quantum is the smallest possible discrete unit of any physical property, and usually refers to properties of atomic or subatomic particles such as electrons, neutrinos, and photons. Quantum mechanics emerged as a branch of physics in the early 1900s to explain nature at the scale of atoms, and has led to major technological advances such as:

  • Transistors
  • Lasers
  • Solar cells
  • Electron microscopes
  • Atomic clocks 
  • Magnetic resonance imaging (MRI)

 

The first wave of quantum technologies gave us the transistor, which became the foundation of modern computers and digital communication. Quantum computing combines two major scientific and technological advances of the past century: quantum mechanics and computer technology.

Quantum computing harnesses the unique behaviour of quantum physics, such as superposition and entanglement, and applies it to computing. 

Superposition: In superposition, quantum particles are a combination of all possible states.

They fluctuate until they are observed and measured. Taking the example of a coin, classical computing can only measure a coin toss as heads or tails. Quantum computing is more nuanced: it is able to look at the coin and see both heads and tails at the same time, as well as every state in between; the coin would be in superposition.
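To make the coin analogy concrete, the toy Python sketch below (plain NumPy, not a real quantum device, and the names are purely illustrative) models the "quantum coin" as a two-entry state vector in an equal superposition and samples measurements from it; a definite heads-or-tails answer only appears once the coin is measured.

```python
import numpy as np

# Toy model of the "quantum coin": an equal superposition of heads and tails.
# The state vector holds one complex amplitude per outcome; measurement
# probabilities are the squared magnitudes of those amplitudes.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)   # (|heads> + |tails>) / sqrt(2)

probabilities = np.abs(state) ** 2                      # [0.5, 0.5]
outcomes = np.random.choice(["heads", "tails"], size=10, p=probabilities)

print("Measurement probabilities:", probabilities)
print("Ten simulated measurements:", list(outcomes))
# Before measurement the coin is described by both amplitudes at once;
# each measurement collapses it to a single definite outcome.
```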

Entanglement: Entanglement refers to the ability of quantum particles to correlate their measurement results with each other. This enables the particles to form a single system and influence each other, so that measurements of one allow conclusions to be drawn about the others. Quantum computers can hence process exponentially more information and solve more complicated problems.
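Entanglement can be sketched the same way, again as a hedged toy simulation rather than real hardware: the two-qubit Bell state below assigns non-zero amplitudes only to the joint outcomes 00 and 11, so every sampled measurement of the pair agrees, and reading one qubit immediately tells you the other.

```python
import numpy as np

# Toy simulation of an entangled pair: the Bell state (|00> + |11>) / sqrt(2).
# The four amplitudes correspond to the joint outcomes 00, 01, 10, 11.
bell_state = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

probabilities = np.abs(bell_state) ** 2                 # [0.5, 0.0, 0.0, 0.5]
joint_outcomes = ["00", "01", "10", "11"]
samples = np.random.choice(joint_outcomes, size=10, p=probabilities)

print("Ten joint measurements:", list(samples))
# Only "00" and "11" ever occur: measuring the first qubit determines
# what the second qubit will read, which is the correlation described above.
```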

 

Quantum computing (Qubits) versus traditional computing (Bits)

Traditional computing systems, which are currently ubiquitous, use bits: tiny switches that can either be in the off position (represented by a zero) or in the on position (represented by a one). Every app, program, website, streaming video or photograph is ultimately made up of millions of these bits in some combination of ones and zeroes.

Quantum computing utilises qubits. Like bits, qubits are the basic unit of information in quantum computing, but they behave very differently. Rather than just being on or off, qubits can also be in what's called superposition, where they are both on and off at the same time, or somewhere on a spectrum between the two. As qubits can assume many states at the same time, they are able to perform more arithmetic operations than conventional bits.
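As a deliberately simplified illustration in plain Python/NumPy, not tied to any particular quantum SDK: a classical bit stores exactly one of two values, while a qubit is described by two complex amplitudes whose squared magnitudes give the probabilities of reading 0 or 1, which is what puts it "somewhere on a spectrum" between off and on.

```python
import numpy as np

# A classical bit: exactly one of two values at any time.
bit = 0  # or 1

# A qubit: two complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# theta picks a point on the spectrum between "off" (|0>) and "on" (|1>).
theta = np.pi / 3
qubit = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

prob_off, prob_on = np.abs(qubit) ** 2
print(f"P(measure 0) = {prob_off:.2f}, P(measure 1) = {prob_on:.2f}")
# The amplitudes describe the qubit before measurement; a readout still
# returns a single 0 or 1, with these probabilities.
```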

In theory, the computing power of quantum computers can increase exponentially with the number of qubits. In practice, however, things are more complicated: the theory only holds as long as all other conditions are met. For instance, the error rate needs to be minimal.

The quantum entanglement between the qubits has to work properly, with even the smallest malfunction resulting in system failure. The challenge for quantum computing developers is therefore not only to get an increasing number of qubits onto the chip but, equally or arguably more importantly, to maintain precision.
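One rough way to see the exponential claim above (a back-of-the-envelope sketch, not a benchmark): fully describing the joint state of n qubits on a classical machine takes 2^n complex amplitudes, so the memory needed to simulate them classically doubles with every extra qubit.

```python
# A classical description of n qubits needs 2**n complex amplitudes.
# At 16 bytes per double-precision complex amplitude, the memory required
# doubles with every qubit added.
for n in (20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits -> {amplitudes:,} amplitudes (~{gib:,.2f} GiB)")
```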

Quantum computing is useful for applications where there is a large, uncertain, complicated system that needs to be simulated.

These applications could cover things as diverse as forecasting financial markets, improving weather forecasts, modelling the behaviour of individual electrons to understand quantum physics, and digital cryptography to improve data encryption.


The Quantum computing revolution

Would you like to know more about this topic?

Download our latest research “The Quantum computing revolution”

Download now