
What are those weird things called quantum computers?

Are they ever going to work out?

Could they kill off encryption as we know it?

If so, what can we do about it, and how long do we have to get ready?

So many questions!

Personal computers have advanced enormously since home computing arrived on the scene nearly 50 years ago.

Processors go about 10,000 times faster, disks store about 100,000 times more, and RAM capacity is up to 1,000,000 times greater.

But today’s computers aren’t radically different in appearance and use from their 1970s counterparts, in much the same way as bicycles and automobiles have remained familiar in form and function, even though they’ve changed a lot in the interim.

On the other hand, if you were suddenly confronted by a contemporary quantum computer, looking like some sort of cyberpunk cross between a jet engine and a pipe organ, you might not even recognise it at first, let alone be able to get in the equivalent of the driver’s seat and figure out what to do with it:

Quantum computing, as envisaged by American physicists Paul Benioff and Richard Feynman in the 1980s, was intended as a way of building computers that could model and work with concepts from quantum mechanics in a way that regular computers couldn’t.

Loosely speaking, classical computers, such as our laptops and mobile phones, store their data as a sequence of binary digits, or *bits*, each of which represents either exactly zero or exactly one.

But quantum bits, or *qubits* as they’re known in the jargon, don’t work that way, because quantum mechanics embraces the concept of *superpositions*, where a system of quantum particles can exist internally in multiple states at the same time.

A quantum superposition only ‘collapses’ to a single state when you actually measure it.

Nobel prize-winner Erwin Schrödinger famously came up with an analogy likening the state of a quantum particle to a cat living invisibly inside a box.

At any point, the cat might be alive or dead, but you can’t tell which until you open the box and peek inside.

As long as you don’t look, you can treat it as though it is both alive *and* dead at the same time.

Even if you have an inkling which is the right answer, for example because you rate its current chance of being alive at 92% based on its age and last known health, you still can’t be sure until you open the box.

At that moment, the cat’s unknowable ‘superposition’ of being 92% alive and 8% dead becomes a 100% certainty either way.
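If you fancy playing with the analogy, here’s a throwaway classical sketch (ordinary probability, nothing genuinely quantum) of repeatedly ‘opening the box’ on a cat that is 92% likely to be alive:

```python
import random

def open_the_box(p_alive=0.92):
    """'Collapse the superposition': every peek gives one definite answer."""
    return "alive" if random.random() < p_alive else "dead"

# Any single observation is 100% one thing or the other...
print(open_the_box())

# ...but over many identical boxes, the 92%/8% split re-emerges.
peeks = [open_the_box() for _ in range(100_000)]
print(peeks.count("alive") / len(peeks))   # close to 0.92
```

Each individual call returns a definite answer, and only the long-run statistics reveal the underlying 92%/8% balance, just as repeated quantum measurements do.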

Theoretically, you can simulate a collection of qubits using a regular computer, but you can’t assign one classical bit to store the value of one quantum bit.

That’s because you need to simulate more than just the ‘on’ or ‘off’ states of a collection of regular bits, typically denoted by the presence or absence of minute electric charges in a conventional RAM chip, or by the direction of a sequence of tiny magnetic fields in a hard disk.

You need to keep an ongoing record of what you can think of as the ‘balance of probabilities’ in the superposition of those qubits, in the same way that our quantum cat might invisibly have been both 92% alive and 8% dead, not merely one or the other.

In a convenient if mathematically casual sense, you can think of the final value when you read out a single qubit as being either “You have ended up at the North Pole of a spherical globe,” denoting 0, or “You have ended up at the South Pole,” denoting 1.

But for as long as you perform quantum calculations on that qubit *without actually measuring it*, you can think of its value as wandering across the entire surface of that globe, with the probability of it finally emerging as 0 or 1 determined by its final proximity to either pole.

In other words, to simulate a single qubit in a superposition of two states means that you need to keep careful track of two precise numbers that represent what are effectively a latitude and a longitude from which its final simulated value can be computed when you finally read it out.

This mathematical trick for representing and working with quantum states as points on a globe is known as the Bloch Sphere, if you would like to look it up, but we’re only going to describe it here, and you don’t need to know exactly how or why the mathematics works in what follows.
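As a hypothetical illustration of that latitude/longitude idea (this uses the standard Bloch Sphere result that the probability of reading out 1 is sin²(θ/2), where θ is the angle down from the North Pole; it is not code from any real quantum toolkit), a single simulated qubit needs just two numbers, and the ‘latitude’ alone fixes the odds of reading out 0 or 1:

```python
import math
import random

def measure(theta, phi, trials=100_000):
    """Simulate repeatedly reading out a qubit at Bloch-sphere angles
    (theta, phi). theta = 0 is the North Pole (always reads 0), theta = pi
    is the South Pole (always reads 1); the 'longitude' phi doesn't affect
    the readout odds at all. Returns the fraction of 1s observed."""
    p_one = math.sin(theta / 2) ** 2
    return sum(random.random() < p_one for _ in range(trials)) / trials

print(measure(0.0, 0.0))          # North Pole: always reads 0
print(measure(math.pi, 0.0))      # South Pole: always reads 1
print(measure(math.pi / 2, 0.0))  # equator: reads 1 about half the time
```

Note that the closer the qubit wanders to a pole, the more lopsided the readout odds become, exactly as described above.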

To make quantum computer simulations even more complicated, a collection of N qubits can exist in a superposition not of N different states but of 2^{N} states, so that managing just 32 qubits already requires keeping track of 2^{32} (more than 4 billion) simultaneous states.

And to simulate a superposition of 256 qubits by using conventional floating point numbers for your latitude/longitude pairs, you’d need more atoms than exist in the observable universe to construct the necessary RAM.
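You can check that claim with back-of-the-envelope arithmetic; the sketch below assumes 16 bytes (one complex double) per superposition state, which is how a typical classical simulator stores its state vector:

```python
def state_vector_bytes(n_qubits):
    """RAM needed for a full state vector of n_qubits, at 16 bytes
    (one complex double) per superposition state."""
    return 16 * 2 ** n_qubits

print(state_vector_bytes(32) / 2**30)    # 32 qubits: 64.0 GiB -- big but buyable
print(f"{state_vector_bytes(256):.2e}")  # 256 qubits: about 1.85e+78 bytes
# That byte count is already in the region of the roughly 10^80 atoms in the
# observable universe, and real RAM needs many atoms per byte of storage.
```

The doubling per qubit is the whole story: every extra qubit doubles the memory bill, so the simulation wall arrives long before 256 qubits.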

Also, quantum mechanics means that a real quantum computer, when performing a quantum calculation on a superposition of real qubits, essentially bakes the outcome of that calculation simultaneously into all the states of that superposition.

Simulating the internal behaviour of the superposition would therefore require you to calculate all those particle interactions sequentially in your conventional computer, which would take exponentially longer as the number of qubits in the superposition increased.

In short, you can see why Benioff and Feynman proposed the idea of building a true quantum computer for research purposes, instead of relying on simulations, even if they weren’t sure how or even if such a device could ever be constructed.

At this point, you’re almost certainly thinking, “What if I could turn the exponential complexity of using a classical computer to simulate a quantum computer on its head, by using a quantum computer to simulate a massively parallel classical computer instead?”

After all, this whole business of quantum calculations being instantly mirrored in all the states of a superposition sounds rather like having access to a classical computer with as many processor cores in parallel as there are states in the superposition.

If a 128-qubit quantum computer could be programmed to simulate a classical computer capable of performing 2^{128} parallel calculations *at the same time*, wouldn’t that make impossible feats possible, such as trying out in one go every possible decryption key for an algorithm with 128-bit (16-byte) passwords?

Even though the superposition would collapse to just 128 bits when you finally read it out, surely that wouldn’t matter, given that you’re only interested in one answer out of all those 2^{128} possibilities?
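For scale (rough arithmetic, assuming a hypothetical classical cluster testing one trillion keys per second), here’s what brute-forcing a 128-bit keyspace would actually cost:

```python
keys = 2 ** 128                        # size of a 128-bit keyspace
rate = 10 ** 12                        # hypothetical: one trillion keys/second
seconds_per_year = 60 * 60 * 24 * 365
years = keys / (rate * seconds_per_year)
print(f"{years:.2e}")                  # about 1.08e+19 years of searching
```

That’s roughly a billion times the age of the universe, which is why a 2^{128}-way parallel search sounds so tantalising.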

But that’s not how superpositions work.

Even though 2^{N} superposition states are essentially commingled into the N quantum bits you started with, you don’t get to look inside the superposition to decide which of the 2^{N} states you want.

When you peek inside the box to settle whether the proverbial cat is alive or dead, your algorithm must, in some quantum-computable way, *already have removed from the superposition* as many unwanted or useless answers as you can, so that the single N-bit value that emerges when you collapse the superposition has a realistic chance of being the answer you want.

If you allow the superposition to encode all possible answers in the hope that you will read out the right one at the end, you might as well use an existing classical computer to choose an answer at random at the outset, and go with that.

Simply put, you can’t encode just any old algorithm into a quantum computer, because only certain types of calculation can be usefully ‘parallelised’ to use a superposition of qubits as if it were a giant collection of processor cores running in parallel.

The good news, therefore, if you’re worried about vital cybersecurity algorithms suddenly running exponentially faster, is that quantum computers are quite limited in the problems they can be programmed to solve.

But here’s the bad news.

Back in 1994, before any quantum computing devices that could run his code had been built, an American mathematician named Peter Shor figured out a quantum algorithm for performing a narrow range of tasks that just happen to be useful in cryptographic cracking.

Very loosely speaking, Shor’s algorithm is connected with an iterative computational process known as *modular exponentiation*.

Exponentiation is where you take a number that we’ll call `G` and repeatedly multiply it by itself `r` times to produce G^{r}, as in calculating 5x5x5x5x5x5x5, written as 5^{7}, which comes out as 78125:

Modular exponentiation, however, is where you break the ever-ascending numeric staircase above by dividing each intermediate result by a prime number that we’ll call `N`, and keeping only the remainder (known as the *modulus*) every time.

For example, if we have G=5 and N=7, then modular exponentiation goes like this:
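Python’s built-in three-argument `pow()` computes modular exponentiation directly, so you can trace the G=5, N=7 example for yourself:

```python
# Modular exponentiation with G = 5 and N = 7: compute G^r mod N
# for r = 1, 2, 3, ... and watch the results cycle instead of climbing.
G, N = 5, 7
for r in range(1, 8):
    print(f"5^{r} mod 7 = {pow(G, r, N)}")
# The remainders run 5, 4, 6, 2, 3, 1 and then repeat from the start,
# instead of growing steadily the way plain exponentiation does.
```

The endlessly repeating, bounded cycle is the crucial difference from ordinary exponentiation, and it’s exactly the structure that matters in what follows.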

As you probably remember from school, the inverse of an exponent is a logarithm, which deals with “What is G^{r}?” in reverse.

In other words, logarithms answer questions of the form: “If 5^{r} gives you 625, solve for **r**.” (We say that r = log_{5}625, which works out to 4.)

For regular exponentiation involving huge numbers, logs are much harder to calculate than exponents, but nevertheless straightforward using classical computers.

With regular exponents, there’s an obvious shortcut based on guessing the likely size of the logarithm from the size of the number whose log you are trying to find, and then looping around making your guess higher or lower each time until you home in on the right answer.

But modular powers don’t reliably get bigger as the exponent increases, because they bounce around in an unpredictable cycle of numbers between 1 and N-1, so the size of the result gives no clue about the size of the exponent.

For modular exponents involving very large numbers, it takes far too many tries, and thus far too long, to deal reliably with problems of the form, “If G^{r} mod N = 1, where G and N have hundreds of digits each, solve for **r**.”

(When we restrict ourselves to whole numbers in our modular calculations, as shown in the diagrams above, these logs are known as *discrete logarithms*, and the difficulty of computing them is referred to as the *discrete logarithm problem*.)
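Here’s a hypothetical brute-force sketch of the discrete logarithm problem at toy scale; with G and N at hundreds of digits, the same loop would need unimaginably more iterations than anyone could ever run:

```python
def discrete_log(G, target, N):
    """Find r such that G^r mod N == target, by exhaustive search.
    Workable for tiny N; utterly impractical at cryptographic sizes,
    because the loop may need up to N - 1 iterations."""
    value = 1
    for r in range(1, N):
        value = (value * G) % N
        if value == target:
            return r
    return None  # no exponent produces this target for this G and N

print(discrete_log(5, 6, 7))   # prints 3, because 5^3 mod 7 = 125 mod 7 = 6
```

Exhaustive search is essentially the position classical computers are in: there is no known shortcut remotely comparable to the one that works for ordinary logarithms.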

Indeed, the computational complexity of reversing modular exponentiation is an important part of modern cryptography.

If we could find a fast way to attack the discrete logarithm problem, we would have a practical trick for cracking the two most widely-used encryption systems used today for online security.

And that’s exactly what Shor’s algorithm aims to do.

To be clear, Shor’s quantum algorithm doesn’t directly solve the discrete logarithm problem on its own.

You still need a classical computer to pre-process your input, to pass it into a quantum computer for filtering, and then to retrieve and process the output to extract your final result.

Nevertheless, it could lead to a method for solving in hours or days a mathematical conundrum that a classical computer alone might take decades or centuries to handle.

That’s because Shor’s quantum algorithm promises a way to crack:

- **The problem of splitting the product of two huge prime numbers back into its constituent parts,** known as factors. The well-known RSA encryption system, still very widely used in web security and for the digital signing of software updates, depends for its security on large prime numbers being easy to multiply together, but hard to split apart by factoring.
- **The problem of reversing the repeated multiplication in modular exponentiation.** Encryption algorithms based on so-called elliptic curves, which are commonly used instead of RSA these days because they’re faster and have smaller encryption keys, depend for their security on modular exponentiation being easy, but on reversing the process to extract discrete logarithms being unfeasibly hard.
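To make the first of those two problems concrete, here’s a toy sketch (hypothetical numbers, nowhere near RSA scale) contrasting how cheap multiplication is with how expensive even naive factoring gets:

```python
def factor_semiprime(n):
    """Recover p and q from n = p * q by trial division.
    Fine for toy numbers; hopeless when p and q have hundreds of digits,
    because the work grows with the size of the smaller prime."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return None

p, q = 999_983, 1_000_003      # two known primes, chosen for illustration
n = p * q                      # multiplying them: effectively instant
print(factor_semiprime(n))     # undoing it: roughly 500,000 trial divisions
```

Scale p and q up to hundreds of digits each and the multiplication stays instant while the trial division becomes astronomically infeasible, which is the asymmetry RSA relies on.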

Shor’s algorithm therefore threatens to undermine all the main types of encryption that we rely on today for online security, including for web browsing, online commerce, and verifying software updates.

Even if quantum computers were useless for anything else, including conducting research into quantum mechanics, which was the original idea behind them…

…you can be sure that intelligence services, governments, major cybercriminal gangs and numerous other groups would dearly love to get their hands on one for its cryptographic cracking potential alone.

In the breathless words of a 2024 article by audit company KPMG:

“Quantum computers are designed to use quantum physics for computing, which introduces unprecedented capabilities over traditional computation methods. […] Quantum computers will be able to break common encryption methods at an alarming speed.”

In short, if you’ve ever wondered why there’s quite so much media coverage of quantum computing, and quite so much money being pumped into both marketing and research, you now have the answer.

But is the news really that bad, or do we still have a glimmer of hope?

The World Economic Forum (WEF) isn’t quite as alarmist, suggesting instead that:

“[Q]uantum computers are able to create ‘multidimensional computational spaces’ which essentially means they can conduct billions of novel calculations at the same time. [… I]t’s not difficult to see that quantum computers have the potential to bypass the encryption locks that currently protect the world’s communications and data.”

The reason that quantum computers have the *potential to bypass* existing cryptographic systems, but haven’t got anywhere near doing so yet, is that it is agonisingly hard to make them reliable.

Like Schrödinger’s cat, which is both alive and dead *only so long as no one peeks into the box*, quantum superpositions only retain what the WEF refers to as ‘multidimensional computational spaces’ if they are impeccably insulated from external influences.

Even otherwise imperceptible intrusions of heat, light or vibration will cause a superposition to collapse before you’re ready, so you need either one set of perfectly insulated and isolated qubits, or a massive collection of almost-perfect qubits to make up for the inevitable errors.

For example, to use Shor’s algorithm to crack today’s common RSA keys of 2048 or 3072 bits, you would need access to several thousand perfectly reliable qubits, or somewhere between tens and hundreds of millions of so-called ‘noisy qubits’ to provide the redundancy needed to have any chance of getting out a correct answer.

But the current world record number of ‘noisy qubits’ in a single quantum computer seems to be just over 1000, far too few to be any use in cracking RSA, even with keys that are far smaller than the recommended minimum size these days.

We could be decades away from a quantum computer that would provide any help at all in cracking today’s cryptographic keys.

Some experts even argue that quantum computers are unlikely ever to produce useful results, with every increase in performance being unavoidably undermined by a decrease in reliability and a rise in error rates, thus continually cancelling out any so-called ‘quantum advantage’ so that you never end up ahead.

Quantum computers may never advance to the performance point at which they can unravel today’s encryption algorithms in hours instead of decades, or days instead of centuries, but they confront us with a problem nevertheless.

If a quantum computer manufacturer somewhere in the world ever constructs thousands of perfectly-behaved qubits that can reliably be placed into a superposition big enough to take on RSA or elliptic curve encryption keys, that manufacturer already has the algorithm it needs to deliver a killer blow to encryption worldwide.

Worse still, attackers who are intercepting and storing encrypted communications right now, even though they can’t make sense of them today, might suddenly find themselves able to unlock years’ worth of stolen trophy data that still needs to be kept confidential.

With this in mind, the US passed a law at the end of 2022, bureaucratically known as **Public Law 117-260 of the 117th Congress**, that introduces two important jargon terms:

“The rapid progress of quantum computing suggests the potential for adversaries of the United States to steal sensitive encrypted data today using classical computers, and wait until sufficiently powerful quantum systems are available to decrypt it.

[…] It is the sense of Congress that a strategy for the migration of information technology of the Federal Government to **post-quantum cryptography** is needed, […and] the government-wide and industry-wide approach to post-quantum cryptography [PQC] should prioritize developing applications, hardware intellectual property, and software that can be easily updated to support **cryptographic agility**.”

Simply put, the law is instructing Americans, and thereby advising everyone else, to:

- **Devise new encryption algorithms** that don’t depend for their continued success on quantum computers continuing to fail.
- **Get into a nimble cybersecurity position** where change for the better can happen quickly when needed.

Making these changes, however, brings a raft of new risks and challenges.

What sort of new algorithms should we go for? Do we all need to switch at the same time? How do we adapt our software to work correctly with these new algorithms?

What happens if we’re in such a hurry that we inadvertently adopt dud algorithms and have to change them all over again? On the other hand, what if we’re too cautious and get caught out anyway?

Join us next week in Part 2 as we look at where this fascinating but vital journey is taking us!

Paul Ducklin is a respected expert with more than 30 years of experience as a programmer, reverser, researcher and educator in the cybersecurity industry. Duck, as he is known, is also a globally respected writer, presenter and podcaster with an unmatched knack for explaining even the most complex technical issues in plain English. Read, learn, enjoy!

*Image of Google’s quantum computer componentry from Google’s official quantumai.google website.*

*Image of old-school cipher machine (the infamous Lorenz Schlüssel-Zusatz 42, known to British codebreakers as Tunny) thanks to Matt Crypto via Wikipedia.*
