Quantum computers represent a leap forward in processing power, promising to outperform even the most powerful supercomputers today on certain classes of problems.
They do this by harnessing some of the unusual properties of quantum physics to tackle problems that are intractable for conventional machines. As a result, quantum computers have profound implications for fields such as medical research, artificial intelligence and modern security practices.
But quantum computers are still in their infancy and are far from a proven technology.
What is it?
Quantum computing is based on quantum bits (known as qubits) that can exist in a 1 or 0 quantum state, or in a ‘superposition’ that combines both states at once. By comparison, traditional binary bits can only exist as a 1 or a 0, with no state in between.
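The distinction can be sketched in a few lines of Python. This is an illustrative model only, not a real quantum API: a qubit is described by two complex amplitudes, and the probability of each measurement outcome is the squared magnitude of the corresponding amplitude (the Born rule).

```python
import math

def measurement_probabilities(a, b):
    """Born rule: reading 0 has probability |a|^2, reading 1 has |b|^2.
    (a, b) are the qubit's amplitudes on the basis states |0> and |1>."""
    return abs(a) ** 2, abs(b) ** 2

# A classical-style bit fixed at 1: all amplitude sits on |1>.
print(measurement_probabilities(0.0, 1.0))  # (0.0, 1.0)

# An equal superposition: a 50/50 chance of reading 0 or 1.
h = 1 / math.sqrt(2)
p0, p1 = measurement_probabilities(h, h)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

A classical bit only ever occupies one of the two extremes; the superposition case is what a qubit adds.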
Quantum computers use one of several methods to create and maintain a quantum state that allows for the manipulation of qubits. By manipulating qubits, enterprises can take advantage of a new type of computing that is better suited to processing computationally complex workloads in a reasonable time frame, such as AI training, research simulations and cryptography.
What’s in it for you?
Unlike traditional bits, qubits scale exponentially: each qubit added doubles the number of states a quantum computer can represent at once. For example, tackling a complex problem like decrypting RSA-encrypted communications would require an extreme level of traditional computing power and a significant amount of time. By comparison, a quantum computer could crack the same encryption much faster. The US National Institute of Standards and Technology (NIST) understands this and in December 2016 asked the public to submit post-quantum algorithms that could potentially resist a quantum computer’s onslaught. Of the submissions, 26 made the cut for potential standardization, and the cryptography community was asked to analyze their performance.
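The scaling claim can be made concrete with a short sketch (our own illustration, not a benchmark): describing n qubits on a classical machine requires tracking 2^n amplitudes, so every additional qubit doubles the bookkeeping a conventional simulator must do.

```python
def state_vector_size(n_qubits):
    """Number of complex amplitudes needed to describe n qubits classically."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(f"{n} qubits -> {state_vector_size(n):,} amplitudes")
```

At around 50 qubits the count exceeds a quadrillion amplitudes, which is why even modest quantum processors quickly become impractical to simulate on classical hardware.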
That opens up new possibilities in any field that demands a large amount of processing power. In research, quantum computers are being tested on extracting insights from colossal data sets and running projects at a scale never before seen. Quantum computing also has the potential to enable incredibly detailed and reliable simulations, helping organizations experiment and innovate with confidence.
What are the trade-offs?
Traditional binary computers are still the best option to tackle the vast majority of tasks within your enterprise.
While potentially unmatched in raw power, quantum computers are expensive, difficult to maintain and still very early in their development. Physically, maintaining the quantum state required to manipulate qubits typically requires an operating temperature close to absolute zero, a costly and complex undertaking in itself.
The other major challenge with quantum computing is a lack of tools to support development, as existing tools for binary computers will not work on quantum hardware. Traditional computing has existed long enough that all kinds of programming languages, tools and processes have been created to support developers. Once quantum computing technology can scale, an equivalent ecosystem will need to be built to support the new generation of quantum coders.
How is it being used?
Quantum computing is very much still in the research phase, with only a handful of technology companies owning and operating quantum computers, often in small-scale projects to test the limits of what’s possible today.
When fully developed, quantum computers could be used by leading innovators to develop new battery technology for electric and autonomous vehicles, support chemistry simulation workloads, develop new materials and perform complex medical/drug research tasks.
In practice, quantum computers are being experimented with for a variety of workloads. Other common examples include:
- Encrypting and decrypting communications
- Training machine learning models
- Simulating quantum mechanical systems
- Optimizing systems and calculating risk
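To give a flavor of the “simulating quantum mechanical systems” workload above, here is a toy state-vector simulation in plain Python (an illustrative sketch, not production code; real experiments use dedicated frameworks such as Qiskit or Cirq). Applying a Hadamard gate to a qubit in the |0⟩ state produces the equal superposition described earlier.

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix into a single-qubit state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = apply_gate(H, [1.0, 0.0])              # start in |0>
probs = [round(abs(a) ** 2, 3) for a in state]  # measurement probabilities
print(probs)  # [0.5, 0.5]: an equal superposition
```

Real quantum simulation workloads chain many such gates across many qubits, which is exactly where the exponential state-space growth makes classical simulation struggle.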