Quantum computing is one of those technologies so mysterious that TV characters name-drop it when they want to sound smart.

The idea of quantum computing has been around for some time: the theoretical possibility was first proposed by Yuri Manin in 1980 and Richard Feynman in 1982. In the past few years, however, the field has moved steadily closer to practicality.

Companies like Google and Microsoft, as well as government agencies like the NSA, have been feverishly tinkering with quantum computers for years. A company called D-Wave makes and sells devices that, while not full general-purpose computers and able to run only a few algorithms, exploit quantum properties and are another step toward a fully Turing-complete quantum machine.

It doesn’t seem unreasonable to say that breakthroughs could happen that would allow the construction of the first large-scale quantum computer within a decade.

So why all the interest? Why should you care? Computers are getting faster and faster. What is so special about quantum computers?

To explain why these machines matter so much, we need to take a step back and understand what quantum computers are and why they work. First, let's talk about a concept called "runtime complexity."

What is Runtime Complexity?

One of the big surprises in the early days of computer science was the discovery that, given a computer that solves a problem of a certain size in a certain amount of time, doubling the computer's speed doesn't necessarily let it solve problems twice as large.

Some algorithms' running time grows very, very quickly as the problem size increases: an algorithm might finish quickly on 100 data points, yet on 1,000 data points require an Earth-sized computer running for a billion years. Runtime complexity is a formalization of this idea: it looks at the curve of how fast an algorithm's running time grows with input size and uses the shape of that curve to classify the algorithm.

Typically, these complexity classes are expressed as functions. An algorithm whose running time grows in proportion to the size of the dataset it operates on (for example, a simple count function) has runtime complexity n (i.e., processing n data points requires n units of time).

Such an algorithm can also be called "linear," because when you plot its running time you get a straight line. Other complexity functions might be n^2, 2^n, or n! (n factorial); these are polynomial, exponential, and factorial respectively. The last two grow so fast that the corresponding algorithms are, in almost all cases, intractable for anything but very trivial inputs.
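To make these growth rates concrete, here is a short Python sketch (an illustration only, counting abstract "steps" rather than real timings) that tabulates how the complexity classes above scale:

```python
import math

# How common complexity classes grow with input size n.
# These are abstract step counts, not real timings.
for n in (5, 10, 20):
    print(f"n={n:>2}  n^2={n**2:>4}  2^n={2**n:>9}  n!={math.factorial(n)}")
```

Even at n = 20, the factorial column has already reached the quintillions, which is why exponential and factorial algorithms are hopeless beyond tiny inputs.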

Runtime Complexity and Cryptography

If you're hearing this stuff for the first time and it sounds abstract and pointless, let's ground the discussion. Runtime complexity is critical to cryptography, which relies on making decryption much easier for people who know a secret key than for those who don't. In an ideal cryptographic scheme, decryption should take linear time if you have the key, and 2^k time (where k is the number of bits in the key) if you don't.

In other words, the best algorithm for decrypting a message without the key should be simply guessing possible keys (a brute-force search), which is infeasible for keys only a few hundred bits long.
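A quick back-of-the-envelope Python calculation shows why brute force fails; the guessing rate of one billion keys per second is an assumption for illustration, not a measured figure:

```python
# Assumed attacker speed: one billion key guesses per second.
GUESSES_PER_SECOND = 1e9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for key_bits in (40, 128, 256):
    total_keys = 2 ** key_bits
    years = total_keys / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{key_bits}-bit key: ~{years:.1e} years to try every key")
```

A 40-bit key falls in under twenty minutes, while a 128-bit key takes on the order of 10^22 years, vastly longer than the age of the universe.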

For symmetric-key cryptography (in which both parties can securely share a secret before they begin communicating), this is fairly straightforward. For asymmetric cryptography, it is harder.

Asymmetric cryptography, in which the encryption and decryption keys differ and cannot easily be computed from one another, is a much more complex mathematical construction than symmetric cryptography, but it is also much more powerful: asymmetric cryptography lets you hold private conversations even over tapped lines! It also lets you create "digital signatures," so you can verify whom a message came from and that it hasn't been tampered with.

These are powerful tools that form the basis of modern privacy: without asymmetric cryptography, users of electronic devices would not have reliable protection from prying eyes.

Because asymmetric cryptography is harder to build than symmetric cryptography, the standard encryption schemes in use today are not as strong as they could be: the most common standard, RSA, can be broken if you can efficiently find the prime factors of a very large number. The good news is that this is a very hard problem.
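To see concretely why factoring breaks RSA, here is a toy Python sketch with deliberately tiny (and hopelessly insecure) primes; real RSA moduli are thousands of bits long:

```python
# Toy RSA with tiny primes -- insecure, for illustration only.
p, q = 61, 53              # the secret primes
n, e = p * q, 17           # public key: modulus n = 3233, exponent e
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

message = 42
cipher = pow(message, e, n)            # anyone can encrypt with (n, e)
assert pow(cipher, d, n) == message    # only the key holder can decrypt

# An attacker who factors n recovers p, q, and hence the private key:
factor = next(k for k in range(2, n) if n % k == 0)
d_cracked = pow(e, -1, (factor - 1) * (n // factor - 1))
print(pow(cipher, d_cracked, n))       # prints 42: the message is exposed
```

With a 3233-digit-free toy modulus, trial division cracks the key instantly; the entire security of RSA rests on that loop being infeasible for real key sizes.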

The best-known algorithm for factoring large numbers into their primes is called the general number field sieve, and its runtime complexity grows somewhat slower than 2^n. As a consequence, keys need to be about ten times as long as they otherwise would to provide similar security, a cost people typically endure as the price of doing business. The bad news is that the whole playing field changes when quantum computers enter the mix.
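The standard heuristic running time of the general number field sieve is exp((64/9)^(1/3) · (ln n)^(1/3) · (ln ln n)^(2/3)). The Python sketch below plugs in a 2048-bit modulus to show how far below 2^n it falls; treat the outputs as rough orders of magnitude, since the real cost depends on constants the formula ignores.

```python
import math

bits = 2048
ln_n = bits * math.log(2)   # natural log of a 2048-bit number

# Heuristic GNFS cost: exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3))
gnfs_steps = math.exp(
    (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
)

print(f"brute force ~ 10^{bits * math.log10(2):.0f} steps")
print(f"GNFS        ~ 10^{math.log10(gnfs_steps):.0f} steps")
```

Sub-exponential, but still astronomically expensive, which is why 2048-bit RSA is considered safe against classical attacks.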

Quantum computers: changing cryptography

Quantum computers work because they can hold multiple internal states at the same time, through a quantum phenomenon called "superposition." This means they can attack different parts of a problem simultaneously, split across possible versions of the universe. They can also be set up so that the branches that solve the problem end up with the largest amplitude, so that when you open the box on Schrödinger's cat, the version of the internal state you are most likely to see is the smug cat holding the decoded message.
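A one-qubit state-vector sketch in Python (an idealized, noiseless model, not how real hardware is programmed) shows what superposition means: two amplitudes held at once, with measurement probabilities given by their squared magnitudes.

```python
import math

state = [1.0, 0.0]                    # the qubit starts in |0>

def hadamard(s):
    """Put a qubit into an equal superposition of |0> and |1>."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[1]), h * (s[0] - s[1])]

state = hadamard(state)
probabilities = [amp ** 2 for amp in state]
print(probabilities)                  # each outcome has probability 0.5
```

Quantum algorithms arrange interference between such amplitudes so that wrong answers cancel out and the right answer dominates at measurement time.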


The upshot is that quantum computers aren't just linearly faster than regular computers: speeding up by a factor of two, ten, or a hundred doesn't help much when traditional cryptanalysis is hundreds of billions of times too slow to be practical. Quantum computers support algorithms with lower runtime complexity than is otherwise possible. This is what makes them fundamentally different from other future computing technologies, such as graphene and memristor computing.

As a specific example, Shor's algorithm, which can only run on a quantum computer, can factor a large number n in roughly log(n)^3 time, dramatically better than the best classical attack. Using the general number field sieve to factor a 2048-bit number takes about 10^41 units of time, which is over a trillion trillion trillion. Using Shor's algorithm, the same problem takes only about 2048^3, or roughly 10^10, units of time.

The effect becomes more pronounced the longer the key is. This is the power of quantum computers.

Don't get me wrong: quantum computers have many potential non-evil uses. Quantum computers could speed up hard optimization problems like the traveling salesman problem, letting researchers build more efficient delivery networks and design better circuits. Quantum computing also has promising applications in artificial intelligence.

However, their effect on cryptography will be disastrous. The encryption technologies that allow our world to function depend on integer factorization being hard to solve. RSA and related encryption schemes let you trust that you're on the right website, that downloads aren't riddled with malware, and that people aren't spying on your internet browsing (if you're using Tor).

Cryptography keeps your bank account safe and protects the world's nuclear infrastructure. When quantum computers become practical, all of this technology stops working. The first organization to develop a quantum computer, if the world is still relying on the technology we use today, will be in a frighteningly powerful position.

So, is a quantum apocalypse inevitable? Can we do something about it? As it turns out… yes.

Post-quantum cryptography

There are several classes of encryption algorithms that, as far as we know, cannot be broken significantly faster on a quantum computer. Collectively known as post-quantum cryptography, they offer some hope that the world can move to cryptosystems that remain secure in an age of quantum codebreaking.

Promising candidates include lattice-based encryption schemes such as Ring Learning with Errors, which draws its security from a learning problem that appears hard even for quantum computers, and multivariate cryptography, which draws its security from the difficulty of solving very large systems of polynomial equations. You can read more about this topic on Wikipedia. Be warned: much of this material is dense, and you may find that your math background needs considerable expansion before you can really dig into the details.
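To give a flavor of lattice-style schemes, here is a toy single-bit encryption sketch in Python in the spirit of plain (not Ring) Learning with Errors. The parameters are made up and far too small to be secure; the sketch only illustrates the hide-a-bit-in-noise idea.

```python
import random

Q = 97     # toy modulus (real schemes use far larger parameters)
N = 8      # dimension of the secret vector

secret = [random.randrange(Q) for _ in range(N)]

def encrypt(bit):
    """Hide one bit in a noisy inner product with the secret."""
    a = [random.randrange(Q) for _ in range(N)]
    noise = random.randrange(-2, 3)            # small error term
    b = (sum(x * y for x, y in zip(a, secret)) + noise + bit * (Q // 2)) % Q
    return a, b

def decrypt(a, b):
    inner = sum(x * y for x, y in zip(a, secret))
    offset = (b - inner) % Q                   # noise, or noise + Q//2
    return 1 if Q // 4 < offset < 3 * Q // 4 else 0

for bit in (0, 1, 1, 0):
    a, b = encrypt(bit)
    assert decrypt(a, b) == bit
print("all bits recovered")
```

Without the secret vector, the pair (a, b) looks like random noise; recovering the bit is an instance of the learning-with-errors problem that appears hard even for quantum computers.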

The takeaway is that post-quantum cryptosystems are very cool, but also very young. More work is needed to make them efficient and practical, and to demonstrate that they are secure. The reason we can trust existing cryptosystems is that we have thrown enough clinically paranoid geniuses at them for long enough that any obvious flaws would have been discovered by now, and researchers have proven the various properties that make them strong.

Modern cryptography relies on light as a disinfectant, and most post-quantum schemes are simply too new to be trusted with the world's security. They are getting there, though, and with a little luck and a little preparation, security experts can complete the transition before the first quantum computer is ever powered up.

If they fail, however, the consequences could be dire. The thought of anyone holding that much power is unsettling, even if you are optimistic about their intentions. The question of who develops a working quantum computer first is one everyone should watch very closely as we enter the next decade.

Are you worried about cryptography being insecure against quantum computers? What's your take? Share your thoughts in the comments below!

Image Credits: Binary Ball Via Shutterstock
