
What is it about the word “quantum” that sucks the brains out of otherwise reasonable people? There has to be some sort of Heisenberg-Schrödinger Credulity Principle that makes all the ideons in their brains go spin-up at the same time, and I’m quite sure that the Many Worlds Interpretation of it has the most merit. (In case you’re a QM n00b, the ideon is the quantum unit of belief.) Fortunately, there seems to be some sanity coming to reporting about quantum computing.
Just about every quantum computing article has a part in it that notes that there are quantum algorithms to break public crypto. The articles breathlessly explain that this means that SSL will be broken and the entire financial world will be in ruins, followed by the collapse of civilization as we know it. Otherwise sensible people focus on this because there’s very little to sink your teeth into in quantum computing otherwise. Even certified experts know that they don’t know what they don’t know.
Scott Aaronson has a good article in Scientific American called “The Limits of Quantum Computers” (only the preview is free, sorry) that gives a good description of what quantum computers can’t do. I’m pleased to see this. SciAm has been a HSCP-induced quantum cheerleader over the last few years.
I have been doing some research on the claims of quantum computing. I decided to pick the specific factoring ability of quantum computers, and produce some actual numbers about how we might expect quantum computing to develop. In other words, I’m going to be a party pooper.
The crypto-obviating algorithms in question are Shor’s algorithm for factoring and an algorithm he developed for discrete logs. I was surprised to learn that Shor’s algorithm requires 72k³ quantum gates to be able to factor a number k bits long. Cubed is a somewhat high power. So I decided to look at a 4096-bit RSA key, which is the largest that most current software supports — the crypto experts all say that if you want something stronger, you should shift to elliptic curve, and the US government is pushing this, too, with their “Suite B” algorithms.
To factor a 4096-bit number, you need 72 × 4096³, or 4,947,802,324,992 quantum gates. Let’s just round that up to an even 5 trillion. Five trillion is a big number. We’re only now getting to the point that we can put about that many normal bits on a disk drive. The first thing this tells me is that we aren’t going to wake up one day and find out that someone’s put that many q-gates on something you can buy from Fry’s or from a white-box Taiwanese special.
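Just to show the arithmetic, here’s that gate count as a quick back-of-the-envelope check, using the 72k³ figure quoted above:

```python
# Shor's algorithm gate count for a k-bit number, using the
# 72 * k**3 estimate quoted above.
k = 4096
gates = 72 * k**3
print(f"{gates:,}")  # 4,947,802,324,992 -- call it 5 trillion
```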
A complication in my calculations is the relationship between quantum gates and quantum bits. For small numbers of qubits, you get about 200 qugates per qubit. But qubits are rum beasts. There are several major technologies that people are trying to tease qubits out of. There’s the adiabatic technologies that D-Wave is trying. There are photon dots, and who knows how many semiconductor-based methods.
It isn’t clear that any of these have any legs. Read Scott Aaronson’s harrumphing at D-Wave, his more pointed yet sympathetic faint praise, and these educated doubts on photonics. Interestingly, Aaronson says that adiabatic quantum computers like D-Wave need k¹¹ gates rather than k³, which pretty much knocks them out of viability altogether, if that’s so.
But let’s just assume that they all work as advertised, today. My next observation is that we’re probably looking at billions of qubits to be able to get trillions of q-gates. My questions to people who know about the relationship between quantum gates and quantum bits yielded that the real experts don’t have a good answer, but that the 200:1 ratio is more likely to go down than up. Intel’s two-billion transistor “Tukwila” chip comes out this year. Five trillion is a big number. We are as likely to need 25 billion qubits to factor that number as any other good guess. Wow.
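For what it’s worth, the 25 billion figure falls out of the same numbers, assuming (and it is only an assumption) that the 200:1 gate-to-qubit ratio holds at scale:

```python
# Qubit estimate, assuming the uncertain 200-gates-per-qubit ratio
# observed on small systems holds for large ones.
gates = 72 * 4096**3        # the ~5 trillion gates from above
qubits = gates // 200
print(f"{qubits:,}")        # 24,739,011,624 -- call it 25 billion qubits
```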
The factoring that has been done on today’s quantum computers is of a four-bit number, 15. If you pay attention to quantum computing articles, you’ll note they always factor 15. There’s a reason for this. It’s of the form (2ⁿ−1) · (2ⁿ+1); for 15, that’s (2²−1) · (2²+1) = 3 · 5. In binary, 2ⁿ−1 is a string of all 1 bits. A number that is 2ⁿ+1 is a 1 bit followed by a string of 0s, and then a 1 again. These numbers are a special form that is easy to factor, and in the real world not going to occur in a public key.
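You can see the special form in a couple of lines (n = 2 gives 15; nothing here is specific to quantum hardware):

```python
# 15 is (2**n - 1) * (2**n + 1) with n = 2:
# an all-ones number times a 1...0...1 number.
n = 2
low, high = 2**n - 1, 2**n + 1
assert low * high == 15
print(bin(low), bin(high))  # 0b11 0b101
```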
This is not a criticism, it’s an observation. You have to walk before you can run, and you have to factor special forms before you can factor the general case. Having observed that, we’ll just ignore it and assume we can factor any four-bit number today.
Let’s presume that quantum computers advance in some exponential curve that resembles Moore’s Law. That is to say that there is going to be a doubling of quantum gates periodically, and we’ll call that period a “generation.” Moore’s specific observation about transistors had a generation every eighteen months.
The difference between factoring four bits and factoring 4096 bits is 30 generations. In other words, 72 · 4³ · 2³⁰ = 72 · 4096³. If we look at a generation of eighteen months, then quantum computers will be able to factor a 4096-bit number in 45 years, or on the Ides of March, 2053.
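The 30-generation figure is just a logarithm; here’s the sketch, assuming a clean doubling of gates per generation:

```python
import math

# From 72 * 4**3 gates (4-bit factoring) to 72 * 4096**3 gates
# (4096-bit factoring), doubling the gate count each generation.
ratio = (72 * 4096**3) / (72 * 4**3)
generations = math.log2(ratio)   # 1024**3 == 2**30, so exactly 30
years = generations * 1.5        # eighteen months per generation
print(generations, years)        # 30.0 45.0
```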
This means to me that my copy of PGP is still going to be safe to use for a while yet. Maybe I oughta get rid of the key I’ve been using for the last few years, but I knew that. I’m not stupid, merely lazy.
I went over to a site that will tell you how long a key you need to use, http://www.keylength.com/. Keylength.com uses estimates made by serious cryptographers for the life of keys. They make some reasonable assumptions and perhaps one slightly-unreasonable assumption: that Moore’s Law will continue indefinitely. If we check there for how long a 4096-bit key will be good for, the conservative estimate is (drum roll, please) — the year 2060.
I’m still struck by how close those dates are. It suggests to me that if quantum computers continue at a rate that semiconductors do, they’ll do little more than continue the pace of technological advancement we’ve seen for the past handful of decades. That’s no mean feat — in 2053, I doubt we’re going to see Intel trumpeting its 45 picometer process (which is what we should see after 30 generations).
I spoke to one of my cryptographer friends and outlined this argument to him. He said that he thinks that the pace of advancement will pick up and be faster than a generation every eighteen months. Sure. I understand that, myself. The pace of advancement in storage has been a generation every year, and in flash memory it’s closer to every nine months. It’s perfectly conceivable that quantum computing will see horrible progress for the next decade and then whoosh off with a generation every six months. That would compress my 45 years into 25, which is a huge improvement but still no reason to go begging ECRYPT for more conferences.
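The compressed 25-year figure assumes (my reading of that scenario) a lost decade followed by all 30 doublings at six months apiece:

```python
# A decade of stalled progress, then 30 doublings at 6-month generations.
slow_years = 10
fast_years = 30 * 0.5
print(slow_years + fast_years)  # 25.0
```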
On the other hand, it’s just as conceivable that quantum computing will end up on the Island of Misfit Technologies, along with flying cars, personal jetpacks, Moon colonies, artificial intelligence, and identity management.
But I also talked to a bigwig in Quantum Information Theory (that’s quantum computing and more) and gave him a sketch of my argument. I heard him speak about Quantum Information and he gave the usual Oooooo Scary Quantum Computers Are Going to Factor Numbers Which Will Cause The Collapse of All Financial Markets And Then We Will All DIEEEEE — So That’s Why We Need More Research Money boosterism.
He wouldn’t let me attribute anything to him, which I understand completely. We live in a world in which partisanship is necessary and if he were seen putting down the pompoms, he’d be fired. Telling middle-aged technocrats that the math says their grandkids are going to see quantum computers shortly before they retire would cause the research money to dry up, and if that happens then — well, the world won’t end. And then where would we be?
Nonetheless, he said to me sotto voce, “There’s nothing wrong with your math.”
Photo is a detail from “Shop Full” by garryw16.