Scientists and SF buffs have been dreaming of radically transforming classical computing’s limitations for decades. As real-world advances start to take shape, quantum dreams become much closer to reality. How are quantum computers different from conventional ones? How will they benefit us? What challenges will the advent of quantum computing bring?
Join us as we glimpse into a not-so-distant quantum future.
How Do Quantum Computers Work?
To understand quantum computing operation, we need to compare it to computers founded on 20th-century principles. A traditional computer uses bits, which can be in one of two binary states expressed as 0 and 1. It uses billions of transistors to perform calculations one task at a time. The more transistors such a computer has, the faster it becomes at general computing.
QCs rely on qubits (quantum bits) instead. Along with 0 and 1, each qubit can take on any value that’s a linear combination of these two states. This property, called superposition, is the basis for unlocking quantum computing’s advantages.
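Superposition can be sketched in plain Python. This is a toy model of my own, not a quantum library: a qubit is just a pair of amplitudes whose squared magnitudes give the probabilities of reading 0 or 1, and "measuring" collapses it to one of those outcomes.

```python
import random

def make_qubit(alpha: complex, beta: complex):
    """A toy qubit: amplitudes (alpha, beta), normalized so that
    |alpha|^2 + |beta|^2 == 1, i.e. the probabilities sum to 1."""
    norm = (abs(alpha) ** 2 + abs(beta) ** 2) ** 0.5
    return (alpha / norm, beta / norm)

def probabilities(qubit):
    """Probability of reading 0 and of reading 1."""
    alpha, beta = qubit
    return abs(alpha) ** 2, abs(beta) ** 2

def measure(qubit):
    """Collapse the superposition: returns 0 or 1 at random,
    weighted by the qubit's amplitudes."""
    p0, _ = probabilities(qubit)
    return 0 if random.random() < p0 else 1

# An equal superposition of 0 and 1: each outcome is read 50% of the time.
q = make_qubit(1, 1)
print(probabilities(q))
print(measure(q))
```

The key point the sketch illustrates: before measurement, the qubit genuinely holds both possibilities at once; measurement yields only one classical bit.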
A classical computer is rigid in comparison. It solves tasks by brute force, calculating possible outcomes one at a time until it reaches the correct one. Advances in transistor scaling and processing power consistently speed up general work. However, certain task types would take modern supercomputers millennia to calculate and will remain impractical despite continuing advances.
Conversely, QCs work with probabilities. They use a multi-dimensional approach to explore countless solutions to problems and highlight the most likely ones to be correct. While imperfect, this approach cuts the time needed to perform specific calculations down to a tiny fraction compared to today’s most advanced supercomputers.
In 2019, using just 53 working qubits, Google’s Sycamore QC prototype performed a calculation in 200 seconds that researchers claimed would take conventional supercomputers 10,000 years. While more a proof of concept than anything else, the test demonstrated quantum supremacy.
Several companies are working on advancing quantum computing in different directions. IBM plans to introduce its 1,121-qubit Condor processor by the end of 2023, while error suppression & mitigation advancements should arrive in 2024.
What Advantages do QCs Have?
QCs excel at making sense of seemingly random information. They can take huge amounts of data and create complex probabilistic models that humans can then consult to make informed decisions.
Predicting complex behaviors has revolutionary implications for many industries. Chemistry, physics, and biology will likely reap the first benefits since quantum computing lends itself well to simulating particle-based systems. For example, chemists could model virtual molecules and accurately predict their behavior & interactions without having to synthesize those molecules in a lab.
Quantum computing’s probabilistic problem-solving will have a transformative impact on the world of finance, too. Applied to traffic, QCs’ ability to juggle untold variables and find the optimal route at any given time will be a boon to logistics. Machine learning and AI are already growing by leaps & bounds. They will grow even more rapidly once QCs become faster and more readily available.
Speaking of growth, QCs have an exponential advantage. Simplified, a classical computer with twice as many transistors is roughly twice as fast as the one it replaces. A QC with n qubits, by contrast, can hold a superposition of 2^n states, so every added qubit doubles its state space. Even without doubling the qubit count, QCs can advance at a much more rapid generational pace.
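The arithmetic behind that claim is easy to check. A quick sketch (plain Python, function name is mine for illustration):

```python
def state_space(n_qubits: int) -> int:
    """Number of basis states an n-qubit register can superpose: 2^n."""
    return 2 ** n_qubits

# Each added qubit doubles the state space; a classical register of the
# same size just gains one more on/off wire.
for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {state_space(n):,} basis states")
```

At 50 qubits, the state space already exceeds a quadrillion basis states, which is why even modest qubit counts can encode problems no transistor budget can brute-force.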
While it’s advancing rapidly, it’s important to note that quantum computing is still in the early development stages.
What Are Quantum Computing’s Risks & Challenges?
If we aren’t prepared, practical QC use could create a major upheaval.
Specifically, it would render current encryption methods useless. For example, we trust VPNs because they encrypt your entire connection. There are many VPN uses, but the main one is letting you browse the internet or connect to your company’s network without fear of snooping or data theft. Tools like these will need to adopt quantum-resistant encryption to keep doing the same job once quantum computers arrive.
Some encryption key generation methods are based on prime number factoring, i.e., determining which prime numbers need to be multiplied to produce a given result. Such calculations stump classical computers, but quantum ones will be able to compute them and crack the code in no time.
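A toy sketch makes the asymmetry concrete (plain Python, names are mine): multiplying two primes is instant, while recovering them by trial division is what scales hopelessly at the key sizes used in real cryptography. Shor’s algorithm, run on a large enough QC, would perform that recovery efficiently.

```python
def factor(n: int) -> tuple[int, int]:
    """Recover (p, q) with p * q == n by naive trial division.
    Fine for toy numbers; hopeless for the ~2048-bit products
    that RSA-style encryption relies on."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

# The "easy" direction: 101 * 103 is computed instantly.
# The "hard" direction: recovering 101 and 103 from 10,403.
print(factor(101 * 103))
```

Trial division takes on the order of the square root of n steps, so every extra digit in the key multiplies the classical workload, which is exactly the safety margin quantum factoring would erase.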
If such computers fell into the wrong hands, the result would be unprecedented cybersecurity risks. Luckily, organizations like the National Institute of Standards and Technology (NIST) are already developing post-quantum encryption methods to combat such threats.
Specific use cases
QCs mop the floor with conventional ones only when asked to do a narrow subset of very specific tasks. Operators need to pose their problems in ways that a QC’s algorithms can make sense of. Moreover, classical computers still vastly outperform them at tasks like calculating the digits of pi.
Hybrid quantum and classical algorithms are showing promise, however. A true general-purpose QC that can complete any task equally well is a matter of when, not if.
Maintenance & accuracy issues
A quantum CPU (central processing unit) is no larger than a conventional processor. Still, QCs take up a lot of space due to the hardware required to keep them near absolute zero. They need such a cold environment to achieve superconductivity, a state in which electrons coursing through the QC encounter no resistance.
It’s almost impossible to fully isolate qubits from the outside world. Stray interactions lead to decoherence, a state where superposition breaks down.
Decoherent qubits collapse into one of the two binary states, so their results are erroneous. Some noise would exist even in ideal circumstances due to QCs’ probabilistic nature. However, decoherence introduces more noise and makes results unreliable.
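The effect of that extra noise on readout can be sketched numerically. This is a deliberately simplified error model of my own (a symmetric bit flip with probability `flip_prob`), not real device physics: it shows how noise drags a meaningful measurement distribution toward an uninformative 50/50.

```python
import random

def observed_p1(p1_ideal: float, flip_prob: float, shots: int) -> float:
    """Sample a qubit that should read 1 with probability p1_ideal,
    flipping each readout with probability flip_prob (toy noise model).
    Returns the observed fraction of 1s."""
    ones = 0
    for _ in range(shots):
        bit = 1 if random.random() < p1_ideal else 0
        if random.random() < flip_prob:  # decoherence-style readout error
            bit ^= 1
        ones += bit
    return ones / shots

random.seed(0)
# Ideally this qubit reads "1" only 10% of the time; 20% noise drags the
# observed rate up toward 0.26, burying the signal the QC computed.
print(observed_p1(0.10, 0.0, 100_000))
print(observed_p1(0.10, 0.2, 100_000))
```

Analytically, the observed rate is p1·(1 − f) + (1 − p1)·f, which is why error suppression and mitigation are such active areas of QC research.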