Saturday, April 01, 2006

SIGFPE's Law

I've decided it's time to stake my claim to a law that will be forever remembered and linked with my name. If Moore can get a law named after him, or even Murphy, then so can I.



SIGFPE's Law

This law is the prediction that the number of qubits in the best available quantum computers is going to grow only linearly with time.


My reason for saying this is essentially the same as the reason that many people are sceptical of quantum computers - decoherence. It seems obvious to me that even with the cleverest of error correction schemes we're going to see the coherence of states decay exponentially fast. I'm bored of reading papers on quantum error correction and seeing that every one makes some assumption or other about the Hamiltonian that is not going to hold in practice. And any little deviation from that assumption becomes the coefficient in an exponential describing that decay.
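To put a cartoon equation behind that intuition (this is a sketch of my reasoning, not a result from any of those papers): suppose an error correction scheme is built around an idealised Hamiltonian H_0, but the hardware actually evolves under H_0 + εV for some small uncontrolled perturbation. Then I expect the fidelity of a protected state to behave, schematically, like

    \[ F(t) \approx e^{-\Gamma(\epsilon)\,t}, \qquad \Gamma(\epsilon) > 0 \text{ whenever } \epsilon \neq 0, \]

so however small the deviation ε is, it sets the rate constant of an exponential decay. Cleverer schemes shrink Γ(ε); they don't change the exponential form.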


On the other hand, people are ingenious. As time goes by they will find ever more ingenious ways to fix the errors caused by decoherence. But the difficulty is going to grow exponentially and the solutions won't scale. So just as classical computing has seen exponential growth in power, I expect quantum computing to see the logarithm of this, and hence linear growth. Maybe, just maybe, there'll be polynomial growth. But definitely not exponential growth. In fact, I expect it to be linear with a pretty shallow gradient, just a couple of qubits a year at most, and probably fewer. Humans will, one day in the distant future, crack large RSA keys using quantum computers - but by the time that day arrives humans will be well beyond caring about such trivial matters.
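To spell out that arithmetic (with made-up constants, purely for illustration): say the engineering effort we can throw at the problem grows Moore's-law style as E(t) = E_0 2^{t/τ}, while the cost of keeping n qubits coherent grows like C(n) = C_0 c^n for some c > 1. The biggest machine we can afford at time t then satisfies

    \[ C_0\, c^{\,n} \approx E_0\, 2^{t/\tau} \quad\Longrightarrow\quad n(t) \approx \frac{t}{\tau \log_2 c} + \text{constant}, \]

which is linear in t, with a gradient that gets shallower the faster the difficulty compounds.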


I also have nothing to lose by saying this. If I turn out to be wrong, and I sincerely hope I am, I can look forward to a world full of wonderful computing machines whose interest value will far outweigh the ignominy of being forever linked with an incorrect law.


BTW, when I say an n-qubit quantum computer I mean a computer whose total memory is n qubits, not a computer whose memory is addressable by an n-bit word.

:-)

1 Comment:

Stefan Ciobaca said...

I hope you're right. If you are, I think cryptography will benefit more in practice from quantum computers than Eve will.

I think the first commercial application of quantum computing would require just one qubit: generating a really random classical bit.

Wednesday, 23 April, 2008  
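
That one-qubit protocol is simple enough to sketch: prepare |0>, apply a Hadamard gate, and measure. Below is a minimal classical simulation in Python (numpy assumed); a simulation necessarily draws on a pseudorandom generator, so it only reproduces the statistics - on real hardware the measurement itself is the entropy source.

    import numpy as np

    # One-qubit random bit: prepare |0>, apply a Hadamard, measure.
    # A classical simulation like this only mimics the statistics;
    # real hardware gets its randomness from the measurement itself.

    ket0 = np.array([1.0, 0.0])                           # the state |0>
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate

    amplitudes = H @ ket0                 # (|0> + |1>) / sqrt(2)
    probs = np.abs(amplitudes) ** 2       # Born rule: [0.5, 0.5]

    rng = np.random.default_rng()
    bit = int(rng.choice(2, p=probs))     # measure in the computational basis
    print(bit)                            # 0 or 1, each with probability 1/2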
