June 2006

Woit, Peter. Not Even Wrong. London: Jonathan Cape, 2006. ISBN 0-224-07605-1.
Richard Feynman, a man about as difficult to bamboozle on scientific topics as any who ever lived, remarked in an interview (p. 180) in 1987, a year before his death:
…I think all this superstring stuff is crazy and it is in the wrong direction. … I don't like that they're not calculating anything. I don't like that they don't check their ideas. I don't like that for anything that disagrees with an experiment, they cook up an explanation—a fix-up to say “Well, it still might be true.”
Feynman was careful to hedge his remark as being that of an elder statesman of science, a class with a history of foolishly dismissing the speculations of younger researchers as nonsense, and he would almost certainly have opposed any effort to cut off funding for superstring research: it might be right, after all, and should be pursued in parallel with other promising avenues until they make predictions which can be tested by experiment, falsifying and excluding those candidate theories whose predictions prove incorrect.

One wonders, however, what Feynman's reaction would have been had he lived to contemplate the contemporary scene in high energy theoretical physics almost twenty years later. String theory and its progeny still have yet to make a single, falsifiable prediction which can be tested by a physically plausible experiment. This isn't surprising, because after decades of work and tens of thousands of scientific publications, nobody really knows, precisely, what superstring (or M, or whatever) theory really is; there is no equation, or set of equations, from which one can draw physical predictions. Leonard Susskind, a co-founder of string theory, observes ironically in his book The Cosmic Landscape (March 2006), “On this score, one might facetiously say that String Theory is the ultimate epitome of elegance. With all the years that String Theory has been studied, no one has ever found a single defining equation! The number at present count is zero. We know neither what the fundamental equations of the theory are or even if it has any.” (p. 204). String theory might best be described as the belief that a physically correct theory exists and may eventually be discovered by the research programme conducted under that name.

From the time Feynman spoke through the 1990s, the goal toward which string theorists were working was well-defined: to find a fundamental theory which reproduces at the low energy limit the successful results of the standard model of particle physics, and explains, from first principles, the values of the many (there are various ways to count them, slightly different—the author gives the number as 18 in this work) free parameters of that theory, whose values are not predicted by any theory and must be filled in by experiment. Disturbingly, theoretical work in the early years of this century has convinced an increasing number of string theorists (but not all) that the theory (whatever it may turn out to be) will not predict a unique low energy limit (or “vacuum state”), but rather an immense “landscape” of possible universes, with estimates like 10^100 and 10^500 and even more bandied around (by comparison, there are only about 10^80 elementary particles in the entire observable universe—a minuscule number compared to such as these). Most of these possible universes would be hideously inhospitable to intelligent life as we know and can imagine it (but our imagination may be limited), and hence it is said that the reason we find ourselves in one of the rare universes which contain galaxies, chemistry, biology, and the National Science Foundation is due to the anthropic principle: a statement, bordering on tautology, that we can only observe conditions in the universe which permit our own existence, and that perhaps either in a “multiverse” of causally disjoint or parallel realities, all the other possibilities exist as well, most devoid of observers, at least those like ourselves (triune glorgs, feeding on bare colour in universes dominated by quark-gluon plasma would doubtless deem our universe unthinkably cold, rarefied, and dead).
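To get a feel for the scale mismatch between the landscape and anything physically enumerable, here is a throwaway back-of-the-envelope sketch. The Planck time and age-of-the-universe figures are standard reference values, not numbers from the book, and the "one vacuum per Planck time" search is of course purely illustrative:

```python
import math

# Back-of-the-envelope scale comparison (Planck time and age of the
# universe are standard reference values, not figures from the book).
age_of_universe_s = 4.35e17   # ~13.8 billion years, in seconds
planck_time_s     = 5.39e-44  # Planck time, in seconds

# Suppose a search could test one candidate vacuum per Planck time,
# running for the entire age of the universe:
checks = age_of_universe_s / planck_time_s
print(f"~10^{math.log10(checks):.0f} candidate vacua examined")  # ~10^61

# Against a landscape of 10^100 to 10^500 vacua, even this absurdly
# generous search covers a vanishingly small fraction.
```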

But adopting the “landscape” view means abandoning the quest for a theory of everything and settling for what amounts to a “theory of anything”. For even if string theorists do manage to find one of those 10^100 or whatever solutions in the landscape which perfectly reproduces all the experimental results of the standard model (and note that this is something nobody has ever done and appears far out of reach, with legitimate reasons to doubt it is possible at all), then there will almost certainly be a bewildering number of virtually identical solutions with slightly different results, so that any plausible experiment which measures a quantity to more precision or discovers a previously unknown phenomenon can be accommodated within the theory simply by tuning one of its multitudinous dials and choosing a different solution which agrees with the experimental results. This is not what many of the generation who built the great intellectual edifice of the standard model of particle physics would have considered doing science.

Now if string theory were simply a chimæra being pursued by a small band of double-domed eccentrics, one wouldn't pay it much attention. Science advances by exploring lots of ideas which may seem crazy at the outset and discarding the vast majority which remain crazy after they are worked out in more detail. Whatever remains, however apparently crazy, stays in the box as long as its predictions are not falsified by experiment. It would be folly of the greatest magnitude, comparable to attempting to centrally plan the economy of a complex modern society, to try to guess in advance, by some kind of metaphysical reasoning, which ideas were worthy of exploration. The history of the S-matrix or “bootstrap” theory of the strong interactions recounted in chapter 11 is an excellent example of how science is supposed to work. A beautiful theory, accepted by a large majority of researchers in the field, which was well in accord with experiment and philosophically attractive, was almost universally abandoned in a few years after the success of the quark model in predicting new particles and the stunning deep inelastic scattering results at SLAC in the 1970s.

String theory, however, despite not having made a single testable prediction after more than thirty years of investigation, now seems to risk becoming a self-perpetuating intellectual monoculture in theoretical particle physics. Among the 22 tenured professors of theoretical physics in the leading six faculties in the United States who received their PhDs after 1981, fully twenty specialise in string theory (although a couple now work on the related brane-world models). These professors employ graduate students and postdocs who work in their area of expertise, and when a faculty position opens up, may be expected to support candidates working in fields which complement their own research. This environment creates a great incentive for talented and ambitious students aiming for one of the rare permanent academic appointments in theoretical physics to themselves choose string theory, as that's where the jobs are. After a generation, this process runs the risk of operating on its own momentum, with nobody in a position to step back and admit that the entire string theory enterprise, judged by the standards of genuine science, has failed, and does not merit the huge human investment by the extraordinarily talented and dedicated people who are pursuing it, nor the public funding it presently receives. If Edward Witten believes there's something still worth pursuing, fine: his self-evident genius and massive contributions to mathematical physics more than justify supporting his work. But this enterprise which is cranking out hundreds of PhDs and postdocs who are spending their most intellectually productive years learning a fantastically complicated intellectual structure with no grounding whatsoever in experiment, most of whom will have no hope of finding permanent employment in the field they have invested so much to aspire toward, is much more difficult to justify or condone.

The problem, to state it in a manner more inflammatory than the measured tone of the author, and in a word of my choosing which I do not believe appears at all in his book, is that contemporary academic research in high energy particle theory is corrupt. As is usually the case with such corruption, the root cause is socialism, although the look-only-left blinders almost universally worn in academia today hide this from most observers there. Dwight D. Eisenhower, however, twigged to it quite early. In his farewell address of January 17th, 1961, which academic collectivists endlessly cite for its (prescient) warning about the “military-industrial complex”, he went on to say, although this is rarely quoted,

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.

And there, of course, is precisely the source of the corruption. This enterprise of theoretical elaboration is funded by taxpayers, who have no say in how their money, taken under threat of coercion, is spent. Which researchers receive funds for what work is largely decided by the researchers themselves, acting as peer review panels. While peer review may work to vet scientific publications, as soon as money becomes involved, the disposition of which can make or break careers, all the venality and naked self- and group-interest which has undone every well-intentioned experiment in collectivism since Robert Owen comes into play, with the completely predictable and tediously repeated results. What began as an altruistic quest driven by intellectual curiosity to discover answers to the deepest questions posed by nature ends up, after a generation of grey collectivism, as a jobs program. In a sense, string theory can be thought of like that other taxpayer-funded and highly hyped program, the space shuttle, which is hideously expensive, dangerous to the careers of those involved with it (albeit in a more direct manner), supported by a standing army composed of some exceptional people and a mass of the mediocre, difficult to close down because it has carefully cultivated a constituency whose own self-interest is invested in continuation of the program, and almost completely unproductive of genuine science.

One of the author's concerns is that the increasingly apparent impending collapse of the string theory edifice may result in the de-funding of other promising areas of fundamental physics research. I suspect he may under-estimate how difficult it is to get rid of a government program, however absurd, unjustified, and wasteful it has become: consider the space shuttle, or mohair subsidies. But perhaps de-funding is precisely what is needed to eliminate the corruption. Why should U.S. taxpayers be spending on the order of thirty million dollars a year on theoretical physics not only devoid of any near- or even distant-term applications, but also mostly disconnected from experiment? Perhaps if theoretical physics returned to being funded by universities from their endowments and operating funds, and by money raised from patrons and voluntarily contributed by the public interested in the field, it would be, albeit a much smaller enterprise, a more creative and productive one. Certainly it would be more honest. Sure, there may be some theoretical breakthrough we might not find for fifty years instead of twenty with massive subsidies. But so what? The truth is out there, somewhere in spacetime, and why does it matter (since it's unlikely in the extreme to have any immediate practical consequences) how soon we find it, anyway? And who knows, it's just possible a research programme composed of the very, very best, whose work is of such obvious merit and creativity that it attracts freely-contributed funds, exploring areas chosen solely on their merit by those doing the work, and driven by curiosity instead of committee group-think, might just get there first. That's the way I'd bet.

For a book addressed to a popular audience, one which contains not a single equation, this is nonetheless quite a difficult read. If you don't follow these matters in some detail, you may find some of the more technical chapters rather bewildering. (The author, to be fair, acknowledges this at the outset.) For example, if you don't know what the hierarchy problem is, or why it is important, you probably won't be able to figure it out from the discussion here. On the other hand, policy-oriented readers will have little difficulty grasping the problems with the string theory programme and its probable causes even if they skip the gnarly physics and mathematics. An entertaining discussion of some of the problems of string theory, in particular the question of “background independence”, in which the string theorists universally assume the existence of a background spacetime which general relativity seems to indicate doesn't exist, may be found in Carlo Rovelli's "A Dialog on Quantum Gravity". For more technical details, see Lee Smolin's Three Roads to Quantum Gravity. There are some remarkable factoids in this book, one of the most stunning being that the proposed TeV class muon colliders of the future will produce neutrino (yes, neutrino) radiation which is dangerous to humans off-site. I didn't believe it either, but look here—imagine the sign: “DANGER: Neutrino Beam”!

A U.S. edition is scheduled for publication at the end of September 2006. The author has operated the Not Even Wrong Web log since 2004; it is an excellent source for news and gossip on these issues. The unnamed “excitable … Harvard faculty member” mentioned on p. 227 and elsewhere is Luboš Motl (who is, however, named in the acknowledgements), and whose own Web log is always worth checking out.


Bartlett, Bruce. Impostor. New York: Doubleday, 2006. ISBN 0-385-51827-7.
This book is a relentless, uncompromising, and principled attack on the administration of George W. Bush by an author whose conservative credentials are impeccable and whose knowledge of economics and public finance is authoritative; he was executive director of the Joint Economic Committee of Congress during the Reagan administration and later served in the Reagan White House and in the Treasury Department under the first president Bush. For the last ten years he was a Senior Fellow at the National Center for Policy Analysis, which fired him in 2005 for writing this book.

Bartlett's primary interest is economics, and he focuses almost exclusively on the Bush administration's spending and tax policies here, with foreign policy, the wars in Afghanistan and Iraq, social policy, civil liberties, and other contentious issues discussed only to the extent they affect the budget. The first chapter, titled “I Know Conservatives, and George W. Bush Is No Conservative” states the central thesis, which is documented by detailed analysis of the collapse of the policy-making process in Washington, the expensive and largely ineffective tax cuts, the ruinous Medicare prescription drug program (and the shameful way in which its known costs were covered up while the bill was rammed through Congress), the abandonment of free trade whenever there were votes to be bought, the explosion in regulation, and the pork-packed spending frenzy in the Republican controlled House and Senate which Bush has done nothing to restrain (he is the first president since John Quincy Adams to serve a full four year term and never veto a single piece of legislation). All of this is documented in almost 80 pages of notes and source references.

Bartlett is a “process” person as well as a policy wonk, and he diagnoses the roots of many of the problems as due to the Bush White House's resembling a third and fourth Nixon administration. There is the same desire for secrecy, the intense value placed on personal loyalty, the suppression of active debate in favour of a unified line, isolation from outside information and opinion, an attempt to run everything out of the White House, bypassing the policy shops and resources in the executive departments, and the paranoia induced by uniformly hostile press coverage and detestation by intellectual elites. Also Nixonesque is the free-spending attempt to buy the votes, at whatever the cost or long-term consequences, of members of groups who are unlikely in the extreme to reward Republicans for their largesse because they believe they'll always get a better deal from the Democrats.

The author concludes that the inevitable economic legacy of the Bush presidency will be large tax increases in the future, perhaps not on Bush's watch, but correctly identified as the consequences of his irresponsibility when they do come to pass. He argues that the adoption of a European-style value-added tax (VAT) is the “least bad” way to pay the bill when it comes due. The long-term damage done to conservatism and the Republican party are assessed, along with prospects for the post-Bush era.

While Bartlett was one of the first prominent conservatives to speak out against Bush, he is hardly alone today, with disgruntlement on the right seemingly restrained mostly due to lack of alternatives. And that raises a question on which this book is silent: if Bush has governed (at least in domestic economic policy) irresponsibly, incompetently, and at variance with conservative principles, what other potential candidate could have been elected instead who would have been the true heir of the Reagan legacy? Al Gore? John Kerry? John McCain? Steve Forbes? What plausible candidate in either party seems inclined and capable of turning things around instead of making them even worse? The irony, and perhaps a fundamental flaw of empire, seems to be that empires don't produce the kind of leaders who built them, nor those required to avert their decline. It's fundamentally a matter of crunchiness and sogginess, and it's why empires don't last forever.


Ortega y Gasset, José. The Revolt of the Masses. New York: W. W. Norton, [1930, 1932, 1964] 1993. ISBN 0-393-31095-7.
This book, published more than seventy-five years ago, when the twentieth century was only three decades old, is a simply breathtaking diagnosis of the crises that manifested themselves in that century and the prognosis for human civilisation. The book was published in Spanish in 1930; this English translation, authorised and approved by the author, by a translator who requested to remain anonymous, first appeared in 1932 and has been in print ever since.

I have encountered few works so short (just 190 pages), which are so densely packed with enlightening observations and thought-provoking ideas. When I read a book, if I encounter a paragraph that I find striking, either in the writing or the idea it embodies, I usually add it to my “quotes” archive for future reference. If I did so with this book, I would find myself typing in a large portion of the entire text. This is not an easy read, not due to the quality of the writing and translation (which are excellent), nor the complexity of the concepts and arguments therein, but simply due to the pure number of insights packed in here, each of which makes you stop and ponder its derivation and implications.

The essential theme of the argument anticipated the crunchy/soggy analysis of society by more than 65 years. In brief, over-achieving self-motivated elites create liberal democracy and industrial economies. Liberal democracy and industry lead to the emergence of the “mass man”, self-defined as not of the elite and hostile to existing elite groups and institutions. The mass man, by strength of numbers and through the democratic institutions which enabled his emergence, seizes the levers of power and begins to use the State to gratify his immediate desires. But, unlike the elites who created the State, the mass man does not think or plan in the long term, and is disinclined to make the investments and sacrifices which were required to create the civilisation in the first place, and remain necessary if it is to survive. In this consists the crisis of civilisation, and grasping this single concept explains much of the history of the seven decades which followed the appearance of the book and events today. Suddenly some otherwise puzzling things start to come into focus, such as why it is, in a world enormously more wealthy than that of the nineteenth century, with abundant and well-educated human resources and technological capabilities which dwarf those of that epoch, there seems to be so little ambition to undertake large-scale projects, and why those which are embarked upon are so often bungled.

In a single footnote on p. 119, Ortega y Gasset explains what the brilliant Hans-Hermann Hoppe spent an entire book doing: why hereditary monarchies, whatever their problems, are usually better stewards of the national patrimony than democratically elected leaders. In pp. 172–186 he explains the curious drive toward European integration which has motivated conquerors from Napoleon through Hitler, and collectivist bureaucratic schemes such as the late, unlamented Soviet Union and the odious present-day European Union. On pp. 188–190 he explains why a cult of youth emerges in mass societies, and why they produce as citizens people who behave like self-indulgent perpetual adolescents. In another little single-sentence footnote on p. 175 he envisions the disintegration of the British Empire, then at its zenith, and the cultural fragmentation of the post-colonial states. I'm sure that few of the author's intellectual contemporaries could have imagined their descendants living among the achievements of Western civilisation yet largely ignorant of its history or cultural heritage; the author nails it in chapters 9–11, explaining why it was inevitable and tracing the consequences for the civilisation, then in chapter 12 he forecasts the fragmentation of science into hyper-specialised fields and the implications of that. On pp. 184–186 he explains the strange attraction of Soviet communism for European intellectuals who otherwise thought themselves individualists—recall, this is but six years after the death of Lenin. And still there is more…and more…and more. This is a book you can probably re-read every year for five years in a row and get something more out of it every time.

A full-text online edition is available, which is odd since the copyright of the English translation was last renewed in 1960 and should still be in effect, yet the site which hosts this edition claims that all their content is in the public domain.


Weinberger, Sharon. Imaginary Weapons. New York: Nation Books, 2006. ISBN 1-56025-849-7.

A nuclear isomer is an atomic nucleus which, due to having a greater spin, different shape, or differing alignment of the spin orientation and axis of symmetry, has more internal energy than the ground state nucleus with the same number of protons and neutrons. Nuclear isomers are usually produced in nuclear fusion reactions when the addition of protons and/or neutrons to a nucleus in a high-energy collision leaves it in an excited state. Hundreds of nuclear isomers are known, but the overwhelming majority decay with gamma ray emission in about 10^-14 seconds. In a few species, however, this almost instantaneous decay is suppressed for various reasons, and metastable isomers exist with half-lives ranging from 10^-9 seconds (one nanosecond) to the isomer Tantalum-180m, which has a half-life of at least 10^15 years and may be entirely stable; it is the only nuclear isomer found in nature and accounts for about one atom in 8300 of tantalum metal.
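The half-life bookkeeping behind these figures is just the exponential decay law; a minimal sketch, using the illustrative numbers from the text:

```python
import math

# Minimal half-life bookkeeping: N(t) = N0 * 2**(-t / t_half).
# Numbers are the illustrative ones from the text.
def fraction_remaining(t, t_half):
    """Fraction of an isomer population surviving after time t."""
    return 2.0 ** (-t / t_half)

# Hf-178m2, half-life ~31 years: after two half-lives (62 years),
# a quarter of the sample remains.
print(fraction_remaining(62, 31))   # 0.25

# Decay constant: lambda = ln 2 / t_half. Activity A = lambda * N,
# which is why shorter-lived species are more intensely radioactive.
lam = math.log(2) / 31
print(f"{lam:.4f} decays per atom-year")   # 0.0224
```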

Some metastable isomers with intermediate half-lives have a remarkably large energy compared to the ground state and emit correspondingly energetic gamma ray photons when they decay. The Hafnium-178m2 (the “m2” denotes the second lowest energy isomeric state) nucleus has a half-life of 31 years and decays (through the m1 state) with the emission of 2.45 MeV in gamma rays. Now the fact that there's a lot of energy packed into a radioactive nucleus is nothing new; physicists were calculating the energy of disintegrating radium and uranium nuclei at the end of the 19th century. But all that energy can't be used for much unless you can figure out some way to release it on demand; as long as it just dribbles out at random, you can use it for some physics experiments and medical applications, but not to make loud bangs or turn turbines. It was only the discovery of the fission chain reaction, in which the fission of certain nuclei liberates neutrons which trigger the disintegration of others in an exponential process, that made nuclear energy, for better or for worse, accessible.

So, as long as there is no way to trigger the release of the energy stored in a nuclear isomer, it is nothing more than an odd kind of radioactive element, the subject of a reasonably well-understood and somewhat boring topic in nuclear physics. If, however, there were some way to externally trigger the decay of the isomer to the ground state, then the way would be open to releasing the energy in the isomer at will. It is possible to trigger the decay of the Tantalum-180 isomer by 2.8 MeV photons, but the energy required to trigger the decay is vastly greater than the 0.075 MeV it releases, so the process is simply an extremely complicated and expensive way to waste energy.
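That energy deficit is easy to make concrete. A tiny sketch of the bookkeeping, using the figures quoted above and charitably assuming every triggering photon actually causes a decay (real cross-sections make the balance far worse):

```python
# Energy bookkeeping for triggering Tantalum-180m, using the figures
# in the text: a 2.8 MeV photon in, 0.075 MeV of stored energy out.
# This charitably assumes every photon triggers a decay; real
# triggering cross-sections make the balance far worse.
e_in_mev  = 2.8     # energy of the triggering photon
e_out_mev = 0.075   # energy released by the isomer transition

gain = e_out_mev / e_in_mev
print(f"best-case energy gain: {gain:.3f}")   # 0.027, a ~97% loss
```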

Researchers in the small community interested in nuclear isomers were stunned when, in the January 25, 1999 issue of Physical Review Letters, a paper by Carl Collins and his colleagues at the University of Texas at Dallas reported they had triggered the release of 2.45 MeV in gamma rays from a sample of Hafnium-178m2 by irradiating it with a second-hand dental X-ray machine, the sample of the isomer sitting on a styrofoam cup. Their report implied, even with this crude apparatus, an energy gain of sixty times break-even, and a triggering rate more than a million times that predicted by nuclear theory, if triggering were possible at all. The result, if real, could have substantial technological consequences: the isomer could be used as a nuclear battery, which could store energy and release it on demand with a density which dwarfed that of any chemical battery and was only a couple of orders of magnitude less than a fission bomb. And, speaking of bombs, if you could manage to trigger a mass of hafnium all at once or arrange for it to self-trigger in a chain reaction, you could make a variety of nifty weapons out of it, including a nuclear hand grenade with a yield of two kilotons. You could also build a fission-free trigger for a thermonuclear bomb which would evade all of the existing nonproliferation safeguards which are aimed at controlling access to fissile material. These are the kind of things that get the attention of folks in that big five-sided building in Arlington, Virginia.
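The energy density behind the "nuclear battery" and two-kiloton hand grenade talk can be checked with a back-of-the-envelope computation. The 2.45 MeV per nucleus is from the text; the TNT and fission comparison figures are assumed standard reference values, not from the book:

```python
# Back-of-the-envelope energy density of Hf-178m2. The 2.45 MeV per
# nucleus is from the text; TNT and fission comparison figures are
# assumed standard values, not from the book.
MEV_TO_J = 1.602e-13     # joules per MeV
AMU      = 1.6605e-27    # atomic mass unit, kg

e_per_nucleus = 2.45 * MEV_TO_J   # energy released per decay, J
m_per_nucleus = 178 * AMU         # mass of a Hf-178 nucleus, kg

isomer_j_per_kg = e_per_nucleus / m_per_nucleus
print(f"{isomer_j_per_kg:.2e} J/kg")   # ~1.3e12 J/kg

tnt_j_per_kg     = 4.6e6    # typical chemical explosive
fission_j_per_kg = 8.2e13   # U-235, ~200 MeV per fission

print(f"{isomer_j_per_kg / tnt_j_per_kg:.0f}x TNT")             # ~290000x
print(f"{fission_j_per_kg / isomer_j_per_kg:.0f}x below U-235") # ~60x
```

So the claim in the text checks out to order of magnitude: vastly denser than chemistry, a couple of orders of magnitude below full fission.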

And so it came to pass, in a Pentagon bent on “transformational technologies” and concerned with emerging threats from potential adversaries, that in May of 2003 a Hafnium Isomer Production Panel (HIPP) was assembled to draw up plans for bulk production of the substance, with visions of nuclear hand grenades, clean bunker-busting fusion bombs, and even hafnium-powered bombers floating before the eyes of the out-of-the-box thinkers at DARPA, who envisioned a two-year budget of USD30 million for the project—military science marches into the future. What's wrong with this picture? Well, actually rather a lot of things.

  • No other researcher had been able to reproduce the results from the original experiment. This included a team of senior experimentalists who used the Advanced Photon Source at Argonne National Laboratory and state of the art instrumentation and found no evidence whatsoever for triggering of the hafnium isomer with X-rays—in two separate experiments.
  • As noted above, well-understood nuclear theory predicted the yield from triggering, if it occurred, to be six orders of magnitude less than reported in Collins's paper.
  • An evaluation of the original experiment by the independent JASON group of senior experts in 1999 determined the result to be “a priori implausible” and “inconclusive, at best”.
  • A separate evaluation by the Institute for Defense Analyses concluded the original paper reporting the triggering results “was flawed and should not have passed peer review”.
  • Collins had never run, and refused to run, a null experiment with ordinary hafnium to confirm that the very small effect he reported went away when the isomer was removed.
  • James Carroll, one of the co-authors of the original paper, had obtained nothing but null results in his own subsequent experiments on hafnium triggering.
  • Calculations showed that even if triggering were to be possible at the reported rate, the process would not come close to breaking even: more than six times as much X-ray energy would go in as gamma rays came out.
  • Even if triggering worked, and some way were found to turn it into an energy source or explosive device, the hafnium isomer does not occur in nature and would have to be made by a hideously inefficient process in a nuclear reactor or particle accelerator, at a cost estimated at around a billion dollars per gram. The explosive in the nuclear hand grenade would cost tens of billions of dollars, compared to which highly enriched uranium and plutonium are cheap as dirt.
  • If the material could be produced and triggering made to work, the resulting device would pose an extreme radiation hazard. Specific activity is inversely proportional to half-life, and the hafnium isomer, with a 31 year half-life, is vastly more radioactive than U-235 (700 million years) or Pu-239 (24,000 years). Further, hafnium isomer decays emit gamma rays, which are the most penetrating form of ionising nuclear radiation and the most difficult against which to shield. The shielding required to protect humans in the vicinity of a tangible quantity of hafnium isomer would more than negate its small mass and compact size.
  • A hafnium explosive device would disperse large quantities of the unreacted isomer (since a relatively small percentage of the total explosive can react before the device is disassembled in the explosion). As it turns out, the half-life of the isomer is just about the same as that of Cesium-137, which is often named as the prime candidate for a “dirty” radiological bomb. One physicist on the HIPP (p. 176) described a hafnium weapon as “the mother of all dirty bombs”.
  • And consider that hand grenade, which would weigh about five pounds. How far can you throw a five pound rock? What do you think about being that far away from a detonation with the energy of two thousand tons of TNT, all released in prompt gamma rays?
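The inverse relationship between activity and half-life noted in the list above can be made concrete: per-atom activity is ln 2 / t½, so specific activity per unit mass scales as 1/(t½·A). A sketch, using standard table values for the half-lives:

```python
import math

# Per-atom activity is lambda = ln 2 / t_half, so specific activity
# per unit mass scales as 1 / (t_half * A). Half-lives are standard
# table values; A is the mass number.
def rel_activity(t_half_years, mass_number):
    """Relative specific activity, arbitrary units."""
    return math.log(2) / (t_half_years * mass_number)

hf = rel_activity(31, 178)       # Hf-178m2
u  = rel_activity(7.04e8, 235)   # U-235
pu = rel_activity(2.41e4, 239)   # Pu-239

print(f"Hf-178m2 vs U-235:  {hf / u:.1e}")    # ~3e7 times hotter
print(f"Hf-178m2 vs Pu-239: {hf / pu:.1e}")   # ~1e3 times hotter
```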

But bad science, absurd economics, a nonexistent phenomenon, damning evaluations by panels of authorities, lack of applications, and ridiculous radiation risk in the extremely improbable event of success pose no insurmountable barriers to a government project once it gets up to speed, especially one in which the relationships between those providing the funding and its recipients are complicated and altogether too cozy. It took an exposé in the Washington Post Magazine by the author and subsequent examination in Congress to finally drive a stake through this madness—maybe. As of the end of 2005, although DARPA was out of the hafnium business (at least publicly), there were rumours of continued funding thanks to a Congressional earmark in the Department of Energy budget.

This book is a well-researched and fascinating look inside the defence underworld where fringe science feeds on federal funds, and starkly demonstrates how weird and wasteful things can get when Pentagon bureaucrats disregard their own science advisors and substitute instinct and wishful thinking for the tedious, but ultimately reliable, scientific method. Many aspects of the story are also quite funny, although U.S. taxpayers who footed the bill for this madness may be less amused. The author has set up a Web site for the book, and Carl Collins, who conducted the original experiment with the dental X-ray and styrofoam cup which incited the mania has responded with his own, almost identical in appearance, riposte. If you're interested in more technical detail on the controversy than appears in Weinberger's book, the Physics Today article from May 2004 is an excellent place to start. The book contains a number of typographical and factual errors, none of which are significant to the story, but when the first line of the Author's Note uses “sited” when “cited” is intended, and in the next paragraph “wondered” instead of “wandered”, you have to—wonder.

It is sobering to realise that this folly took place entirely in the public view: in the open scientific literature, university labs, unclassified defence funding subject to Congressional oversight, and ultimately in the press, and yet over a period of years millions in taxpayer funds were squandered on nonsense. Just imagine what is going on in highly-classified “black” programs.