Friday, July 21, 2006
Reading List: Programming the Universe
- Lloyd, Seth. Programming the Universe. New York: Alfred A. Knopf, 2006. ISBN 1-4000-4092-2.
The author has devoted his professional career to exploring the deep connections between information processing and the quantum mechanical foundations of the universe. Although his doctorate is in physics, he is a professor of mechanical engineering at MIT, which I suppose makes him an honest-to-God quantum mechanic. A pioneer in the field of quantum computation, he suggested the first physically realisable quantum computational device, and is the author of the landmark papers which evaluated the computational power of the “ultimate laptop” computer which, if its one kilogram of mass and one litre of volume crunched any faster, would collapse into a black hole; estimated the computational capacity of the entire visible universe; and explored how gravitation and spacetime could be emergent properties of a universal quantum computation.
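To make the “ultimate laptop” limit concrete, here is a back-of-the-envelope sketch of the bound Lloyd applies (my own restatement of the Margolus–Levitin theorem, not a quotation from the book or the papers): a system with energy E above its ground state can perform at most 2E/(πℏ) elementary logical operations per second, so devoting the entire mass-energy of a one kilogram computer to computation gives

\[
\nu_{\max} \;\le\; \frac{2E}{\pi\hbar} \;=\; \frac{2mc^{2}}{\pi\hbar}
\;\approx\; \frac{2\,(1\,\mathrm{kg})\,(3\times10^{8}\,\mathrm{m/s})^{2}}{\pi\,(1.05\times10^{-34}\,\mathrm{J\,s})}
\;\approx\; 5\times10^{50}\ \text{operations per second.}
\]

Crunching any faster requires concentrating the energy into an ever smaller volume, which is, roughly, how the black-hole limit mentioned above arises.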
In this book, he presents these concepts to a popular audience, beginning by explaining the fundamentals of quantum mechanics and the principles of quantum computation, before moving on to the argument that the universe as a whole is a universal quantum computer whose future cannot be predicted by any simulation less complicated than the universe as a whole, nor any faster than the future actually evolves (a concept reminiscent of Stephen Wolfram's argument in A New Kind of Science, but phrased in quantum mechanical rather than classical terms). He argues that all of the complexity we observe in the universe is the result of the universe performing a computation whose input is the random fluctuations created by quantum mechanics. But, unlike the proverbial monkeys banging on typewriters, the quantum mechanical primate fingers are, in effect, typing on the keys of a quantum computer which, like the cellular automata of Wolfram's book, has the capacity to generate extremely complex structures from very simple inputs. Why was the universe so simple shortly after the big bang? Because it hadn't had the time to compute very much structure. Why is the universe so complicated today? Because it's had sufficient time to perform 10^122 logical operations up to the present.
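The 10^122 figure is the same kind of estimate applied to the universe as a whole; schematically (again my own sketch of the reasoning, not the book's derivation), the total number of logical operations performed since the Big Bang is bounded by

\[
N_{\mathrm{ops}} \;\lesssim\; \frac{2\,E_{\mathrm{horizon}}\,t_{\mathrm{universe}}}{\pi\hbar},
\]

where E_horizon is the energy within the causally connected universe and t_universe the time elapsed since the Big Bang; rough cosmological inputs give a number of the order of 10^120 to 10^123, the exact exponent depending on how the energy and horizon are counted.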
I found this book, on the whole, a disappointment. Having read the technical papers cited above before opening it, I didn't expect to learn any additional details from a popularisation, but I did hope the author would provide a sense of how the field evolved, of where he saw this research programme going in the future, and of how it might (or might not) fit with other approaches to the unification of quantum mechanics and gravitation. There are some interesting anecdotes about the discovery of the links between quantum mechanics, thermodynamics, statistical mechanics, and information theory, and the personalities involved in that work, but one leaves the book without any sense of where future research might be going, nor of how these theories might be tested by experiment in the near or even distant future. The level of the intended audience is difficult to discern. Unlike some popularisers of science, Lloyd does not shrink from using equations where they clarify physical relationships and even introduces and uses Dirac's “bra-ket” notation (for example, ⟨φ|ψ⟩), yet almost everywhere he writes a number in scientific notation, he also gives it in the utterly meaningless form of (p. 165) “100 billion billion billion billion billion billion billion billion billion billion” (OK, I've done that myself, on one occasion, but I was having fun at the expense of a competitor). And finally, I find it dismaying that a popular science book by a prominent researcher published by a house as respectable as Knopf at a cover price of USD26 lacks an index—this is a fundamental added value that the reader deserves when parting with this much money (especially for a book of only 220 pages). If you know nothing about these topics, this volume will probably leave you only more confused, and possibly over-optimistic about the state of quantum computation. If you've followed the field reasonably closely, the author's professional publications (most available on-line), which are lucidly written and accessible to the non-specialist, may be more rewarding.
I remain dubious about grandiose claims for quantum computation, and nothing in this book dispelled my scepticism. From Democritus all the way to the present day, every single scientific theory which assumed the existence of a continuum has been proved wrong when experiments looked more closely at what was really going on. Yet quantum mechanics, albeit a statistical theory at the level of measurement, is completely deterministic and linear in the evolution of the wave function, with amplitudes given by continuous complex values which embody, theoretically, an infinite amount of information. Where is all this information stored? The Bekenstein bound gives an upper limit on the amount of information which can be represented in a given volume of spacetime, and that implies that even if the quantum state were stored nonlocally in the entire causally connected universe, the amount of information would still be finite (albeit enormous). Extreme claims for quantum computation assume you can linearly superpose any number of wave functions and thus encode as much information as you like in a single computation. The entire history of science, and of quantum mechanics itself, makes me doubt that this is so—I'll bet that we eventually find some inherent granularity in the precision of the wave function (perhaps round-off errors in the simulation we're living within, but let's not revisit that). This is not to say, nor do I mean to imply, that quantum computation will not work; indeed, it has already been demonstrated in proof-of-concept laboratory experiments, and it may well hold the potential of extending the growth of computational power after the pure scaling of classical computers runs into physical limits. But just as shrinking semiconductor devices is fundamentally constrained by the size of atoms, quantum computation may be limited by the ultimate precision of the discrete computational substrate of the universe which behaves, on the large scale, like a continuous wave function.
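For reference, the Bekenstein bound invoked in the paragraph above can be written (in a standard form, not quoted from the book) as

\[
I \;\le\; \frac{2\pi R E}{\hbar c \ln 2}\ \text{bits},
\]

where R is the radius of a sphere enclosing the system and E its total energy, rest mass included. A one kilogram object a tenth of a metre in radius, for example, comes in at roughly 10^42 bits: an unimaginably large number, but a finite one, which is the sense in which even the quantum state of the entire causally connected universe can encode only a finite amount of information.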
Posted at July 21, 2006 23:32