At a Glance – Bits, Qubits & Qudits

Accelerating Quantum Computing

A bit is the basic unit of information. Bits are used to measure the amount of information in digitised data and the speed at which it travels. A regular bit can exist in one of two states: 0 or 1.

A qubit (quantum bit), on the other hand, can exist in a superposition of 0 and 1 at the same time. It takes eight regular bits to store any one number between 0 and 255 on an average computer, but with eight qubits, a quantum computer can hold a superposition of all 256 of those numbers at once. A quantum computer with 300 linked qubits could, in principle, represent more states at one time than there are atoms in the observable universe. If that isn’t impressive enough, scientists have taken things further with a new quantum unit of information – a qudit.
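For readers who want to check the numbers, here is a minimal back-of-the-envelope sketch of the counting behind these claims. It is plain arithmetic rather than quantum code, and the figure of roughly 10^80 atoms in the observable universe is a common order-of-magnitude estimate, not a value taken from this article:

```python
# Back-of-the-envelope arithmetic behind the qubit claims above.
# Note: this is plain counting, not a quantum simulation.

classical_values_8_bits = 2 ** 8    # 8 bits pick one of 256 values at a time
qubit_basis_states_8 = 2 ** 8       # 8 qubits can hold a superposition over all 256

qubit_basis_states_300 = 2 ** 300   # roughly 2 x 10^90 basis states
atoms_estimate = 10 ** 80           # common estimate of atoms in the observable universe

print(f"8 bits:     one of {classical_values_8_bits} values at a time")
print(f"8 qubits:   a superposition over all {qubit_basis_states_8} values")
print(f"300 qubits: {qubit_basis_states_300:.1e} basis states "
      f"vs ~{atoms_estimate:.0e} atoms")
```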

Simply put, a qudit is a qubit with more levels: where a qubit works with two states, a qudit can work with three, four or more. Qubits are incredibly powerful, but they’re notoriously difficult to create. There are several methods for making qubits, but none can be easily scaled up. Qudits could offer a more practical way to carry out quantum calculations, as they’re more straightforward to make.

Canada’s National Institute of Scientific Research in Quebec has created a microchip that can generate two entangled qudits by manipulating photons emitted from a microring resonator. The pair of qudits carries more information than six entangled qubits, and the system doesn’t require state-of-the-art equipment. Cost is a major barrier to quantum research, but because these qudits are made using commercially available technology, they have the potential to make development much cheaper.
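To make the comparison with six qubits concrete, here is a rough sketch of the state-space arithmetic. The article doesn’t give the number of levels per qudit, so d = 10 levels is an assumption used purely for illustration:

```python
# State-space comparison sketch; d = 10 levels per qudit is an assumed
# figure for illustration, not one stated in the article.
d = 10                       # assumed number of levels per qudit
two_qudit_states = d ** 2    # two entangled qudits: 10 x 10 = 100 basis states
six_qubit_states = 2 ** 6    # six qubits: 2^6 = 64 basis states

print(f"2 qudits (d = {d}): {two_qudit_states} basis states")
print(f"6 qubits:           {six_qubit_states} basis states")
```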

Qudits could be used to tackle perplexing mathematical problems, advance quantum communications and handle giant datasets that are too challenging for qubits. The new unit of information may also benefit cybersecurity, securing data with quantum encryption. Easy to make and mind-blowingly powerful, qudits could be instrumental in the acceleration of quantum computing.