A Tale of Two Qubits
In 1981, the famous physicist Richard Feynman asked whether it was possible to efficiently simulate physics on a computer, and suggested that a machine built on quantum principles would be needed to do it. That idea helped spark the development of quantum computers.
Quantum computing focuses on developing computer technology based on the principles of quantum mechanics, the branch of physics that describes how energy and matter behave at atomic and subatomic scales. Quantum computing applies this scientific wizardry to data, harnessing the behaviour of atoms and subatomic particles to perform tasks that are well beyond the capabilities of a regular computer. If it sounds complicated, that’s because it is.
In a conventional computer, eight bits (the basic units of information) store one number between 0 and 255. In a quantum computer, eight qubits (you guessed it – quantum bits) can exist in a superposition of all 256 of these numbers at once. Regular bits are either 0 or 1 – but a qubit can be a blend of both. This doesn’t quite mean running billions of copies of a computation side by side, but it does let a quantum computer explore an enormous number of possibilities simultaneously. Basically, quantum computers can tackle certain processing problems on a massive scale.
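To make the bits-versus-qubits comparison concrete, here is a minimal sketch in plain Python. It simulates the maths only: a classical byte holds one value, while an 8-qubit register is described by a vector of 2⁸ = 256 amplitudes, and a uniform superposition spreads equal amplitude across every value at once. (The variable names are illustrative, not from any quantum library.)

```python
import math

# Classical register: 8 bits store exactly one number from 0 to 255.
classical_byte = 0b10110010  # a single value: 178

# Quantum register (simulated): 8 qubits are described by a state
# vector of 2**8 = 256 amplitudes. A uniform superposition assigns
# equal amplitude to every possible 8-bit value simultaneously.
n_qubits = 8
n_states = 2 ** n_qubits
amplitude = 1 / math.sqrt(n_states)   # 1/16 for eight qubits
state = [amplitude] * n_states        # all 256 values represented at once

# The squared amplitudes are probabilities and must sum to 1.
# Measuring collapses the register to one 8-bit outcome,
# here each with probability 1/256.
total_probability = sum(a ** 2 for a in state)

print(classical_byte)                  # one stored number
print(len(state))                      # 256 amplitudes
print(round(total_probability, 10))    # 1.0
```

The catch, of course, is that reading the register yields only a single outcome – the art of quantum algorithms is arranging the amplitudes so that useful answers become likely.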
So, why is quantum computing important?
In a world of big data, it’s becoming more and more necessary to deal with vast amounts of information. Applications of quantum computing include complicated calculations, memory tasks, virtual experiments and generally dealing with mass data.
Potential challenges to quantum computing include the daunting task of actually building quantum computers and, for corporations, the sheer cost of buying one. D-Wave were the first company to sell a quantum computer, for a cool $10 million. However, with time, quantum computing could revolutionise a number of sectors, including aerospace, healthcare and artificial intelligence – just three areas that depend on processing vast amounts of data. And that’s not all – NASA and Google have partnered up to buy a quantum computer, with one hoped-for application being helping to locate habitable planets.