« May 2006 | Main | July 2006 »
Monday, June 26, 2006
Reading List: Imaginary Weapons
- Weinberger, Sharon. Imaginary Weapons. New York: Nation Books, 2006. ISBN 1-56025-849-7.
-
A nuclear isomer is an atomic nucleus which, due to having a greater spin, different shape, or differing alignment of the spin orientation and axis of symmetry, has more internal energy than the ground state nucleus with the same number of protons and neutrons. Nuclear isomers are usually produced in nuclear fusion reactions when the addition of protons and/or neutrons to a nucleus in a high-energy collision leaves it in an excited state. Hundreds of nuclear isomers are known, but the overwhelming majority decay with gamma ray emission in about 10⁻¹⁴ seconds. In a few species, however, this almost instantaneous decay is suppressed for various reasons, and metastable isomers exist with half-lives ranging from 10⁻⁹ seconds (one nanosecond) to that of the isomer Tantalum-180m, which has a half-life of at least 10¹⁵ years and may be entirely stable; it is the only nuclear isomer found in nature and accounts for about one atom in 8300 of tantalum metal.
Some metastable isomers with intermediate half-lives have a remarkably large energy compared to the ground state and emit correspondingly energetic gamma ray photons when they decay. The Hafnium-178m2 (the “m2” denotes the second lowest energy isomeric state) nucleus has a half-life of 31 years and decays (through the m1 state) with the emission of 2.45 MeV in gamma rays. Now the fact that there's a lot of energy packed into a radioactive nucleus is nothing new—people were calculating the energy of disintegrating radium and uranium nuclei at the end of the 19th century—but all that energy can't be used for much unless you can figure out some way to release it on demand. As long as it just dribbles out at random, you can use it for some physics experiments and medical applications, but not to make loud bangs or turn turbines. It was only the discovery of the fission chain reaction, in which the fission of certain nuclei liberates neutrons which trigger the disintegration of others in an exponential process, which made nuclear energy, for better or for worse, accessible.
So, as long as there is no way to trigger the release of the energy stored in a nuclear isomer, it is nothing more than an odd kind of radioactive element, the subject of a reasonably well-understood and somewhat boring topic in nuclear physics. If, however, there were some way to externally trigger the decay of the isomer to the ground state, then the way would be open to releasing the energy in the isomer at will. It is possible to trigger the decay of the Tantalum-180 isomer by 2.8 MeV photons, but the energy required to trigger the decay is vastly greater than the 0.075 MeV it releases, so the process is simply an extremely complicated and expensive way to waste energy.
Researchers in the small community interested in nuclear isomers were stunned when, in the January 25, 1999 issue of Physical Review Letters, a paper by Carl Collins and his colleagues at the University of Texas at Dallas reported they had triggered the release of 2.45 MeV in gamma rays from a sample of Hafnium-178m2 by irradiating it with a second-hand dental X-ray machine with the sample of the isomer sitting on a styrofoam cup. Their report implied, even with the crude apparatus, an energy gain of sixty times break-even, which was more than a million times the rate predicted by nuclear theory, if triggering were possible at all. The result, if real, could have substantial technological consequences: the isomer could be used as a nuclear battery, which could store energy and release it on demand with a density which dwarfed that of any chemical battery and was only a couple of orders of magnitude less than a fission bomb. And, speaking of bombs, if you could manage to trigger a mass of hafnium all at once or arrange for it to self-trigger in a chain reaction, you could make a variety of nifty weapons out of it, including a nuclear hand grenade with a yield of two kilotons. You could also build a fission-free trigger for a thermonuclear bomb which would evade all of the existing nonproliferation safeguards which are aimed at controlling access to fissile material. These are the kind of things that get the attention of folks in that big five-sided building in Arlington, Virginia.
Imaginary Hafnium Hand Grenade
Diameter: 5 inches
Yield: 2 kilotons

And so it came to pass, in a Pentagon bent on “transformational technologies” and concerned with emerging threats from potential adversaries, that in May of 2003 a Hafnium Isomer Production Panel (HIPP) was assembled to draw up plans for bulk production of the substance, with visions of nuclear hand grenades, clean bunker-busting fusion bombs, and even hafnium-powered bombers floating before the eyes of the out-of-the-box thinkers at DARPA, who envisioned a two-year budget of USD 30 million for the project—military science marches into the future. What's wrong with this picture? Well, actually rather a lot of things.
- No other researcher had been able to reproduce the results from the original experiment. This included a team of senior experimentalists who used the Advanced Photon Source at Argonne National Laboratory and state of the art instrumentation and found no evidence whatsoever for triggering of the hafnium isomer with X-rays—in two separate experiments.
- As noted above, well-understood nuclear theory predicted the yield from triggering, if it occurred, to be six orders of magnitude less than reported in Collins's paper.
- An evaluation of the original experiment by the independent JASON group of senior experts in 1999 determined the result to be “a priori implausible” and “inconclusive, at best”.
- A separate evaluation by the Institute for Defense Analyses concluded the original paper reporting the triggering results “was flawed and should not have passed peer review”.
- Collins had never run, and refused to run, a null experiment with ordinary hafnium to confirm that the very small effect he reported went away when the isomer was removed.
- James Carroll, one of the co-authors of the original paper, had obtained nothing but null results in his own subsequent experiments on hafnium triggering.
- Calculations showed that even if triggering were to be possible at the reported rate, the process would not come close to breaking even: more than six times as much X-ray energy would go in as gamma rays came out.
- Even if triggering worked, and some way were found to turn it into an energy source or explosive device, the hafnium isomer does not occur in nature and would have to be made by a hideously inefficient process in a nuclear reactor or particle accelerator, at a cost estimated at around a billion dollars per gram. The explosive in the nuclear hand grenade would cost tens of billions of dollars, compared to which highly enriched uranium and plutonium are cheap as dirt.
- If the material could be produced and triggering made to work, the resulting device would pose an extreme radiation hazard. Radiation is inverse to half-life, and the hafnium isomer, with a 31 year half-life, is vastly more radioactive than U-235 (700 million years) or Pu-239 (24,000 years). Further, hafnium isomer decays emit gamma rays, which are the most penetrating form of ionising nuclear radiation and the most difficult against which to shield. The shielding required to protect humans in the vicinity of a tangible quantity of hafnium isomer would more than negate its small mass and compact size.
- A hafnium explosive device would disperse large quantities of the unreacted isomer (since a relatively small percentage of the total explosive can react before the device is disassembled in the explosion). As it turns out, the half-life of the isomer is just about the same as that of Cesium-137, which is often named as the prime candidate for a “dirty” radiological bomb. One physicist on the HIPP (p. 176) described a hafnium weapon as “the mother of all dirty bombs”.
- And consider that hand grenade, which would weigh about five pounds. How far can you throw a five pound rock? What do you think about being that far away from a detonation with the energy of two thousand tons of TNT, all released in prompt gamma rays?
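A minimal back-of-envelope check of the figures in the bullets above (stored energy per gram, the comparison with TNT and fission, the grenade yield, and the relative radioactivity of the isomer) can be done in a few lines. The physical constants are standard values; the conversion of the five-pound grenade to grams is my own:

```python
# Sanity check of the Hf-178m2 numbers quoted above.

AVOGADRO = 6.022e23     # nuclei per mole
EV_TO_J = 1.602e-19     # joules per electron-volt

# Stored energy: 2.45 MeV per nucleus, molar mass ~178 g/mol.
e_per_nucleus = 2.45e6 * EV_TO_J                 # joules per decay
e_per_gram = e_per_nucleus * AVOGADRO / 178.0    # ~1.3e9 J/g

# Compare with chemical and fission energy densities.
TNT_J_PER_G = 4.184e3        # by definition of the "ton of TNT"
FISSION_J_PER_G = 8.2e10     # ~200 MeV per U-235 fission
print(e_per_gram / TNT_J_PER_G)      # ~3e5 times TNT per gram
print(FISSION_J_PER_G / e_per_gram)  # fission is still ~60x denser

# Yield of the hypothetical 5 lb (~2270 g) grenade if every
# nucleus could somehow be triggered (1 kt = 1e9 g of TNT):
grenade_kt = e_per_gram * 2270 / (TNT_J_PER_G * 1e9)
print(grenade_kt)    # ~0.7 kt: same order as the quoted 2 kt

# Specific activity scales as 1/(half-life x molar mass):
# Hf-178m2 (31 yr) versus U-235 (7.0e8 yr).
activity_ratio = (7.0e8 / 31) * (235.0 / 178.0)
print(activity_ratio)    # ~3e7 times as radioactive as U-235
```

The numbers come out consistent with the text: an energy density dwarfing any chemical explosive, yet a couple of orders of magnitude below fission, and a material tens of millions of times more radioactive than highly enriched uranium.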
But bad science, absurd economics, a nonexistent phenomenon, damning evaluations by panels of authorities, lack of applications, and ridiculous radiation risk in the extremely improbable event of success pose no insurmountable barriers to a government project once it gets up to speed, especially one in which the relationships between those providing the funding and its recipients are complicated and cozy to an unseemly degree. It took an exposé in the Washington Post Magazine by the author and subsequent examination in Congress to finally drive a stake through this madness—maybe. As of the end of 2005, although DARPA was out of the hafnium business (at least publicly), there were rumours of continued funding thanks to a Congressional earmark in the Department of Energy budget.
This book is a well-researched and fascinating look inside the defence underworld where fringe science feeds on federal funds, and starkly demonstrates how weird and wasteful things can get when Pentagon bureaucrats disregard their own science advisors and substitute instinct and wishful thinking for the tedious, but ultimately reliable, scientific method. Many aspects of the story are also quite funny, although U.S. taxpayers who footed the bill for this madness may be less amused. The author has set up a Web site for the book, and Carl Collins, who conducted the original experiment with the dental X-ray and styrofoam cup which incited the mania, has responded with his own, almost identical in appearance, riposte. If you're interested in more technical detail on the controversy than appears in Weinberger's book, the Physics Today article from May 2004 is an excellent place to start. The book contains a number of typographical and factual errors, none of which are significant to the story, but when the first line of the Author's Note uses “sited” when “cited” is intended, and in the next paragraph “wondered” instead of “wandered”, you have to—wonder.
It is sobering to realise that this folly took place entirely in the public view: in the open scientific literature, university labs, unclassified defence funding subject to Congressional oversight, and ultimately in the press, and yet over a period of years millions in taxpayer funds were squandered on nonsense. Just imagine what is going on in highly-classified “black” programs.
Sunday, June 25, 2006
HTML: Tiled Background Image Alignment
I'm currently preparing a new on-line book for the Fourmilab archives (it will be announced here when it's ready) in which I decided to use a background image to highlight certain passages of text. I defined the style for such text with a CSS specification like the following:

.inveigh {
    background-color: #FFEEB3;
    background-image: url(bgtile.png);
    color: inherit;
}
(Specifying a “background-color” as well as the “background-image” causes the text to be highlighted in that solid colour even if the user has disabled the display of images.) For the background, I drew an image which would “tile” both vertically and horizontally, such as the one below:
and then I applied the style to the body of a paragraph with HTML code like this:
<p> <span class="inveigh"> Why dost thou converse . . . </span> </p>
When I viewed the page with Mozilla Firefox (version 1.5.0.4 on Linux), I was astonished and dismayed to see a background like that below, which I'm including as a screen grab image to avoid browser compatibility issues we'll get to later. (The quote used in this example is from Prince Henry's conversation with Falstaff in Act 2, Scene 4 of Shakespeare's Henry IV, Part I.)
Yuck! Instead of the two-way cross-hatching you'd expect, the result looks like chevrons which don't even align from one line to the next. Stranger still, if I resized the browser window, the alignment would shift depending upon the width of the window. After verifying that the background image did, indeed, tile properly, and after trying lots of things such as different image sizes (my original images weren't a power of two in size, and I suspected that might be a problem), I viewed the page with version 9.00 of the Opera browser, which is known for its CSS standards compliance, to rule out the possibility of an obscure bug in Firefox or the Gecko rendering engine it uses. Opera displayed the page with the same weird chevron effect.
After what would have been an extended bout of hair-pulling, had I enough hair remaining to pull, I discovered that Firefox and Opera render a background image differently depending upon the hierarchical level of the element to which the style is applied. If you apply the style to an inline element such as “<span>”, the image is tiled horizontally on a line-by-line basis, but not vertically; if the style is applied to a block-level element such as “<div>”, it is tiled both vertically and horizontally to fill the entire box the element occupies. Changing the HTML to:
<div class="inveigh"> <p> Why dost thou converse . . . </p> </div>
results in the following display of the paragraph. (Note that due to nesting rules, at least in Strict XHTML 1.0, we must move the paragraph container inside the division, as opposed to the inline span, which is used within a paragraph.)
Ahhhh…sweet planar tiling! Note that the background also now fills the entire box containing the paragraph, instead of extending only to the end of each ragged right line when applied using “<span>”.
Now if you're a grizzled Web page developer (which you probably are, if you've read this far), the question on the tip of your tongue is, of course, “What does Exploder do?”. Well, as is so often the case, Microsoft Internet Explorer, in both versions 6.0 and 7.0, behaves differently from the other browsers. In this case, it renders the background image tiled both vertically and horizontally, regardless of whether it is applied to an inline span or a division container, although the right margin continues to reflect the difference. This means that pages which use tiled backgrounds applied with “<span>” tags and tested only with Internet Explorer will “break” when viewed with other browsers.
If you'd like to test this phenomenon with your own browser, please visit this document which contains both examples shown above. If you observe any curious and/or interesting behaviour with other browsers, let me know with the feedback button and I'll report it here.
Update: Reader Lindsey observes that the “<span>” tag can be made to behave as a block-level element by adding a “display: block;” property to the style definition applied to the span. This works fine in Firefox, Opera, and Internet Explorer. Declaring the span to be a block-level element also causes the background to fill the box instead of appearing ragged right. I have added an example of such a declaration to the example document. Still, as Lindsey notes, it's better to apply the background style to an enclosing container which is a block-level element to begin with instead of forcing the browser to change the interpretation of an inline element to behave as a block. (2006-06-26 12:24 UTC)
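For reference, here is a minimal sketch of the two styles discussed above: the original inline class from the text, and a variant incorporating Lindsey's “display: block;” suggestion (the class name “.inveigh-block” is my own, invented for illustration):

```css
/* Original style from the text: applied to a <span>, Firefox and
   Opera tile the background line-by-line, but not vertically. */
.inveigh {
    background-color: #FFEEB3;
    background-image: url(bgtile.png);
    color: inherit;
}

/* Lindsey's variant: forces the element to generate a block box,
   so the background tiles in both directions and fills the box. */
.inveigh-block {
    background-color: #FFEEB3;
    background-image: url(bgtile.png);
    color: inherit;
    display: block;
}
```

As noted above, applying the background to an element which is block-level to begin with (such as “&lt;div&gt;”) remains the cleaner solution.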
Tuesday, June 20, 2006
Reading List: The Revolt of the Masses
- Ortega y Gasset, José. The Revolt of the Masses. New York: W. W. Norton, [1930, 1932, 1964] 1993. ISBN 0-393-31095-7.
- This book, published more than seventy-five years ago, when the twentieth century was only three decades old, is a simply breathtaking diagnosis of the crises that manifested themselves in that century and the prognosis for human civilisation. The book was published in Spanish in 1930; this English translation, authorised and approved by the author, by a translator who requested to remain anonymous, first appeared in 1932 and has been in print ever since. I have encountered few works so short (just 190 pages), which are so densely packed with enlightening observations and thought-provoking ideas. When I read a book, if I encounter a paragraph that I find striking, either in the writing or the idea it embodies, I usually add it to my “quotes” archive for future reference. If I did so with this book, I would find myself typing in a large portion of the entire text. This is not an easy read, not due to the quality of the writing and translation (which are excellent), nor the complexity of the concepts and arguments therein, but simply due to the pure number of insights packed in here, each of which makes you stop and ponder its derivation and implications. The essential theme of the argument anticipated the crunchy/soggy analysis of society by more than 65 years. In brief, over-achieving self-motivated elites create liberal democracy and industrial economies. Liberal democracy and industry lead to the emergence of the “mass man”, self-defined as not of the elite and hostile to existing elite groups and institutions. The mass man, by strength of numbers and through the democratic institutions which enabled his emergence, seizes the levers of power and begins to use the State to gratify his immediate desires. 
But, unlike the elites who created the State, the mass man does not think or plan in the long term, and is disinclined to make the investments and sacrifices which were required to create the civilisation in the first place, and remain necessary if it is to survive. In this consists the crisis of civilisation, and grasping this single concept explains much of the history of the seven decades which followed the appearance of the book and events today. Suddenly some otherwise puzzling things start to come into focus, such as why it is, in a world enormously more wealthy than that of the nineteenth century, with abundant and well-educated human resources and technological capabilities which dwarf those of that epoch, there seems to be so little ambition to undertake large-scale projects, and why those which are embarked upon are so often bungled. In a single footnote on p. 119, Ortega y Gasset explains what the brilliant Hans-Hermann Hoppe spent an entire book doing: why hereditary monarchies, whatever their problems, are usually better stewards of the national patrimony than democratically elected leaders. In pp. 172–186 he explains the curious drive toward European integration which has motivated conquerors from Napoleon through Hitler, and collectivist bureaucratic schemes such as the late, unlamented Soviet Union and the odious present-day European Union. On pp. 188–190 he explains why a cult of youth emerges in mass societies, and why they produce as citizens people who behave like self-indulgent perpetual adolescents. In another little single-sentence footnote on p. 175 he envisions the disintegration of the British Empire, then at its zenith, and the cultural fragmentation of the post-colonial states. 
I'm sure that few of the author's intellectual contemporaries could have imagined their descendants living among the achievements of Western civilisation yet largely ignorant of its history or cultural heritage; the author nails it in chapters 9–11, explaining why it was inevitable and tracing the consequences for the civilisation, then in chapter 12 he forecasts the fragmentation of science into hyper-specialised fields and the implications of that. On pp. 184–186 he explains the strange attraction of Soviet communism for European intellectuals who otherwise thought themselves individualists—recall, this is but six years after the death of Lenin. And still there is more…and more…and more. This is a book you can probably re-read every year for five years in a row and get something more out of it every time. A full-text online edition is available, which is odd since the copyright of the English translation was last renewed in 1960 and should still be in effect, yet the site which hosts this edition claims that all their content is in the public domain.
Friday, June 16, 2006
New Serial Feature: 1903 Book Catalogue
Today marks the start of a new Fourmilab serial feature which will run for the next couple of months until the entire 1903 book catalogue of the Frederick J. Drake & Company of Chicago is on-line. This catalogue appears in the back of one of their books that bears a copyright date of 1903, and two of the books listed in it are described as New or Latest editions for 1903 and 1904, so I presume it dates from one of those years. Just reading the descriptions of the books immerses you in a world long gone and largely forgotten, when humour trod on topics forbidden today, home handymen consulted 560 page two-volume works on the use of the steel square, and the infatuated would turn to North's Book of Love Letters for help in winning the heart of their dearly beloved. Teddy Roosevelt was president of the 45 United States, the Wright brothers were fiddling around with a curious contraption on a sand dune in North Carolina, and the dollar was as good as gold: 50 cents would buy you a hardbound book of 200 pages, shipped postpaid to almost anywhere in the civilised world, which wasn't ashamed to so describe itself. The pages are presented as JPEG images, scanned at 600 DPI and scaled to a uniform height of 768 pixels. The quality of the original printing isn't all that good—the ink blots and occasional missing letters were there from the start; there is almost no degradation due to age apart from yellowing of the paper (which isn't evident in these greyscale images). Special thanks to Bill Walker for discovering and providing the source document for this project.

Monday, June 12, 2006
Reading List: Impostor
- Bartlett, Bruce. Impostor. New York: Doubleday, 2006. ISBN 0-385-51827-7.
- This book is a relentless, uncompromising, and principled attack on the administration of George W. Bush by an author whose conservative credentials are impeccable and whose knowledge of economics and public finance is authoritative; he was executive director of the Joint Economic Committee of Congress during the Reagan administration and later served in the Reagan White House and in the Treasury Department under the first president Bush. For the last ten years he was a Senior Fellow at the National Center for Policy Analysis, which fired him in 2005 for writing this book. Bartlett's primary interest is economics, and he focuses almost exclusively on the Bush administration's spending and tax policies here, with foreign policy, the wars in Afghanistan and Iraq, social policy, civil liberties, and other contentious issues discussed only to the extent they affect the budget. The first chapter, titled “I Know Conservatives, and George W. Bush Is No Conservative” states the central thesis, which is documented by detailed analysis of the collapse of the policy-making process in Washington, the expensive and largely ineffective tax cuts, the ruinous Medicare prescription drug program (and the shameful way in which its known costs were covered up while the bill was rammed through Congress), the abandonment of free trade whenever there were votes to be bought, the explosion in regulation, and the pork-packed spending frenzy in the Republican controlled House and Senate which Bush has done nothing to restrain (he is the first president since John Quincy Adams to serve a full four year term and never veto a single piece of legislation). All of this is documented in almost 80 pages of notes and source references. Bartlett is a “process” person as well as a policy wonk, and he diagnoses the roots of many of the problems as due to the Bush White House's resembling a third and fourth Nixon administration. 
There is the same desire for secrecy, the intense value placed on personal loyalty, the suppression of active debate in favour of a unified line, isolation from outside information and opinion, an attempt to run everything out of the White House, bypassing the policy shops and resources in the executive departments, and the paranoia induced by uniformly hostile press coverage and detestation by intellectual elites. Also Nixonesque is the free-spending attempt to buy the votes, at whatever the cost or long-term consequences, of members of groups who are unlikely in the extreme to reward Republicans for their largesse because they believe they'll always get a better deal from the Democrats. The author concludes that the inevitable economic legacy of the Bush presidency will be large tax increases in the future, perhaps not on Bush's watch, but correctly identified as the consequences of his irresponsibility when they do come to pass. He argues that the adoption of a European-style value-added tax (VAT) is the “least bad” way to pay the bill when it comes due. The long-term damage done to conservatism and the Republican party are assessed, along with prospects for the post-Bush era. While Bartlett was one of the first prominent conservatives to speak out against Bush, he is hardly alone today, with disgruntlement on the right seemingly restrained mostly due to lack of alternatives. And that raises a question on which this book is silent: if Bush has governed (at least in domestic economic policy) irresponsibly, incompetently, and at variance with conservative principles, what other potential candidate could have been elected instead who would have been the true heir of the Reagan legacy? Al Gore? John Kerry? John McCain? Steve Forbes? What plausible candidate in either party seems inclined and capable of turning things around instead of making them even worse? 
The irony, and a fundamental flaw, of Empire seems to be that empires don't produce the kind of leaders who built them, or who are required to avert their decline. It's fundamentally a matter of crunchiness and sogginess, and it's why empires don't last forever.
SubMarie's: Course Correction, Further Experimentation
When last I wrote of my quest to reproduce Marie's Blue Cheese salad dressing, I noted that direct comparison of the commercial product with the most recent iteration of my replica recipe had led me to conclude that the Roquefort cheese I was using was significantly more strongly flavoured and salty than the blue cheese in Marie's, and that experiments with other kinds of blue cheese were in order to try to come closer to the bull's eye. A few days later, I was reading up on blue cheeses in Steven Jenkins's Cheese Primer, where he observes (p. 155), “I have never cottoned to the practice of blending Roquefort into a dressing for salad. For this purpose, any blue cheese, such as Danish Blue, will do, and at one-third to one-fourth of the price. The deep, full, spicy round flavor of Roquefort is denigrated when used in this manner. It deserves solo billing alongside a salad; then, both tastes are elevated, rather than diminished.” Well, if you live in Switzerland, one thing you should never do is denigrate cheese, even if it comes from across the border! So, I decided to try a different kind of blue cheese in the next batch of SubMarie's, and picked up a variety when next I visited the grocery store, including the recommended Danish Blue (“Bleu Danois”—this was sold by the cut at the cheese counter and no brand name was in evidence, but according to Jenkins it is mass produced and generally consistent in quality), St. Agur, and Bleu de Bresse. I tasted these (and a few others, which weren't even close), and decided the Danish Blue was the closest to the blue cheese used in Marie's, with the St. Agur in second place, but closer to Roquefort than the Danish Blue. 
Based on my taste testing to date, I decided that from now on, my goal would be to reproduce Marie's “Super” recipe instead of the original “Chunky” because, having had the opportunity to make a direct comparison, there's no question that the Super, with 25% more blue cheese according to the label (yet 10 fewer calories and 2 g less fat per serving—blue cheese, sinful as it may be, still finishes second on the express lane to the afterlife compared to mayonnaise!), is without the slightest doubt the better salad dressing. Not only are there fewer calories in the Super; since it's more strongly flavoured, you may end up using less of it to obtain the same blue cheese bite on your salad.

Sub Marie's: Attempt 6 | |
---|---|
Danish Blue cheese | 100 g |
Sour cream | 4 tbsp / 60 ml |
Buttermilk | 4 tbsp / 60 ml |
Mayonnaise | 10 tbsp / 150 ml |
White vinegar | 1/2 tsp / 2.5 ml |
Salt | 1/8 tsp |
Mustard powder | 1/4 tsp |
Garlic powder | 1/2 tsp |
- Marie's seems to have slightly more buttermilk bite than my recipe. I will increase the buttermilk from 4 tbsp to 6 tbsp in the next “build”. This will restore buttermilk as the second largest ingredient in the recipe, in accordance with the list on the Marie's package, and also reduce the viscosity, which I was previously trying to maximise after an initial bout of runniness, but in which I may have overshot the goal in the latest rounds.
- My recipe still seems slightly more salty. I will delete the small amount (1/8 tsp) of added salt the next time.
- I will reduce the quantity of ground mustard to 1/8 tsp. I doubt anybody can tell the difference, but I suspect the buttermilk would benefit from less competition.
I suspect that if you want to approximate the original “Chunky” recipe, you can make up a batch of my emulation of the “Super” and give it a good squirt of mayonnaise (and maybe some sour cream) to increase the volume by about 20%, and you'll end up with something close. I haven't tried this, but I shall eventually, as long as my comparison sample of the Chunky doesn't go bad before I get around to it.
Friday, June 9, 2006
Puzzle: "Outsiders" Elected U.S. Presidents
After listening to the Instapundit podcast with Michael Barone which discussed, among other things, whether a third-party candidacy for the U.S. presidency was viable in the current era, and reading Peggy Noonan's column on the same topic a couple of weeks later, it seems to me, even as an outside observer, that there is a growing sense in the U.S. not only that things are running off the rails, but that the political system, which increasingly seems to have become polarised into a Party of Corruption and a Party of Moonbats, is unlikely to offer up candidates capable of doing anything sensible about the situation before the impending train wreck. This causes people to think about the possibility of an insurgency led by an outsider, in the mold (but less nutty, and without the funny ears) of Ross Perot's candidacy in 1992. But the lesson of history to date has been that Americans don't elect outsiders as Presidents, which leads to today's puzzle, which I will pose in three stages of increasing difficulty, only revealing the next after you display the answer to the last. If I wrote out all the questions at once, some of the later ones would give away or provide hints to previous questions. We'll start with a really easy one.

Who was the last U.S. president with no prior experience in elective office?

Come on—you shouldn't have to wrack your brain too hard for this one!
Thursday, June 8, 2006
Reading List: Not Even Wrong
- Woit, Peter. Not Even Wrong. London: Jonathan Cape, 2006. ISBN 0-224-07605-1.
-
Richard Feynman, a man about as difficult to bamboozle on
scientific topics as any who ever lived, remarked
in an interview (p. 180) in 1987, a year before his death:
…I think all this superstring stuff is crazy and it is in the wrong direction. … I don't like that they're not calculating anything. I don't like that they don't check their ideas. I don't like that for anything that disagrees with an experiment, they cook up an explanation—a fix-up to say “Well, it still might be true.”
Feynman was careful to hedge his remark as being that of an elder statesman of science, who collectively have a history of foolishly considering the speculations of younger researchers to be nonsense, and he would almost certainly have opposed any effort to cut off funding for superstring research, as it might be right, after all, and should be pursued in parallel with other promising avenues until they make predictions which can be tested by experiment, falsifying and leading to the exclusion of those candidate theories whose predictions are incorrect. One wonders, however, what Feynman's reaction would have been had he lived to contemplate the contemporary scene in high energy theoretical physics almost twenty years later. String theory and its progeny have yet to make a single, falsifiable prediction which can be tested by a physically plausible experiment. This isn't surprising, because after decades of work and tens of thousands of scientific publications, nobody really knows, precisely, what superstring (or M, or whatever) theory really is; there is no equation, or set of equations, from which one can draw physical predictions. Leonard Susskind, a co-founder of string theory, observes ironically in his book The Cosmic Landscape, “On this score, one might facetiously say that String Theory is the ultimate epitome of elegance. With all the years that String Theory has been studied, no one has ever found a single defining equation! The number at present count is zero. We know neither what the fundamental equations of the theory are or even if it has any.” (p. 204). 
String theory might best be described as the belief that a physically correct theory exists and may eventually be discovered by the research programme conducted under that name.

From the time Feynman spoke through the 1990s, the goal toward which string theorists were working was well-defined: to find a fundamental theory which reproduces in the low energy limit the successful results of the standard model of particle physics, and explains, from first principles, the values of the many (there are various ways to count them, slightly different—the author gives the number as 18 in this work) free parameters of that theory, whose values are not predicted by any theory and must be filled in by experiment. Disturbingly, theoretical work in the early years of this century has convinced an increasing number of string theorists (but not all) that the theory (whatever it may turn out to be) will not predict a unique low energy limit (or “vacuum state”), but rather an immense “landscape” of possible universes, with estimates like 10^100 and 10^500 and even more bandied around (by comparison, there are only about 10^80 elementary particles in the entire observable universe—a minuscule number compared to such as these).
Most of these possible universes would be hideously inhospitable to intelligent life as we know and can imagine it (but our imagination may be limited), and hence it is said that the reason we find ourselves in one of the rare universes which contain galaxies, chemistry, biology, and the National Science Foundation is due to the anthropic principle: a statement, bordering on tautology, that we can only observe conditions in the universe which permit our own existence, and that perhaps either in a “multiverse” of causally disjoint or parallel realities, all the other possibilities exist as well, most devoid of observers, at least those like ourselves (triune glorgs, feeding on bare colour in universes dominated by quark-gluon plasma would doubtless deem our universe unthinkably cold, rarefied, and dead).
But adopting the “landscape” view means abandoning the quest for a theory of everything and settling for what amounts to a “theory of anything”. For even if string theorists do manage to find one of those 10^100 or whatever solutions in the landscape which perfectly reproduces all the experimental results of the standard model (and note that this is something nobody has ever done and appears far out of reach, with legitimate reasons to doubt it is possible at all), then there will almost certainly be a bewildering number of virtually identical solutions with slightly different results, so that any plausible experiment which measures a quantity to more precision or discovers a previously unknown phenomenon can be accommodated within the theory simply by tuning one of its multitudinous dials and choosing a different solution which agrees with the experimental results. This is not what many of the generation who built the great intellectual edifice of the standard model of particle physics would have considered doing science.
Now if string theory were simply a chimæra being pursued by a small band of double-domed eccentrics, one wouldn't pay it much attention. Science advances by exploring lots of ideas which may seem crazy at the outset and discarding the vast majority which remain crazy after they are worked out in more detail. Whatever remains, however apparently crazy, stays in the box as long as its predictions are not falsified by experiment. It would be folly of the greatest magnitude, comparable to attempting to centrally plan the economy of a complex modern society, to try to guess in advance, by some kind of metaphysical reasoning, which ideas were worthy of exploration. The history of the S-matrix or “bootstrap” theory of the strong interactions recounted in chapter 11 is an excellent example of how science is supposed to work. A beautiful theory, accepted by a large majority of researchers in the field, which was well in accord with experiment and philosophically attractive, was almost universally abandoned in a few years after the success of the quark model in predicting new particles and the stunning deep inelastic scattering results at SLAC in the 1970s. String theory, however, despite not having made a single testable prediction after more than thirty years of investigation, now seems to risk becoming a self-perpetuating intellectual monoculture in theoretical particle physics. Among the 22 tenured professors of theoretical physics in the leading six faculties in the United States who received their PhDs after 1981, fully twenty specialise in string theory (although a couple now work on the related brane-world models). These professors employ graduate students and postdocs who work in their area of expertise, and when a faculty position opens up, may be expected to support candidates working in fields which complement their own research. 
This environment creates a great incentive for talented and ambitious students aiming for one of the rare permanent academic appointments in theoretical physics to themselves choose string theory, as that's where the jobs are. After a generation, this process runs the risk of operating on its own momentum, with nobody in a position to step back and admit that the entire string theory enterprise, judged by the standards of genuine science, has failed, and does not merit the huge human investment by the extraordinarily talented and dedicated people who are pursuing it, nor the public funding it presently receives. If Edward Witten believes there's something still worth pursuing, fine: his self-evident genius and massive contributions to mathematical physics more than justify supporting his work. But this enterprise which is cranking out hundreds of PhDs and postdocs who are spending their most intellectually productive years learning a fantastically complicated intellectual structure with no grounding whatsoever in experiment, most of whom will have no hope of finding permanent employment in the field they have invested so much to aspire toward, is much more difficult to justify or condone. The problem, to state it in a manner more inflammatory than the measured tone of the author, and in a word of my choosing which I do not believe appears at all in his book, is that contemporary academic research in high energy particle theory is corrupt. As is usually the case with such corruption, the root cause is socialism, although the look-only-left blinders almost universally worn in academia today hide this from most observers there. Dwight D. Eisenhower, however, twigged to it quite early.
In his farewell address of January 17th, 1961, which academic collectivists endlessly cite for its (prescient) warning about the “military-industrial complex”, he went on to say, although this is rarely quoted:

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government. Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers. The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.
And there, of course, is precisely the source of the corruption. This enterprise of theoretical elaboration is funded by taxpayers, who have no say in how their money, taken under threat of coercion, is spent. Which researchers receive funds for what work is largely decided by the researchers themselves, acting as peer review panels. While peer review may work to vet scientific publications, as soon as money becomes involved, the disposition of which can make or break careers, all the venality and naked self- and group-interest which has undone every well-intentioned experiment in collectivism since Robert Owen comes into play, with the completely predictable and tediously repeated results. What began as an altruistic quest driven by intellectual curiosity to discover answers to the deepest questions posed by nature ends up, after a generation of grey collectivism, as a jobs program. In a sense, string theory can be thought of like that other taxpayer-funded and highly hyped program, the space shuttle, which is hideously expensive, dangerous to the careers of those involved with it (albeit in a more direct manner), supported by a standing army composed of some exceptional people and a mass of the mediocre, difficult to close down because it has carefully cultivated a constituency whose own self-interest is invested in continuation of the program, and almost completely unproductive of genuine science. One of the author's concerns is that the increasingly apparent impending collapse of the string theory edifice may result in the de-funding of other promising areas of fundamental physics research. I suspect he may under-estimate how difficult it is to get rid of a government program, however absurd, unjustified, and wasteful it has become: consider the space shuttle, or mohair subsidies. But perhaps de-funding is precisely what is needed to eliminate the corruption. Why should U.S. 
taxpayers be spending on the order of thirty million dollars a year on theoretical physics not only devoid of any near- or even distant-term applications, but also mostly disconnected from experiment? Perhaps if theoretical physics returned to being funded by universities from their endowments and operating funds, and by money raised from patrons and voluntarily contributed by the public interested in the field, it would be, albeit a much smaller enterprise, a more creative and productive one. Certainly it would be more honest. Sure, there may be some theoretical breakthrough we might not find for fifty years instead of twenty with massive subsidies. But so what? The truth is out there, somewhere in spacetime, and why does it matter (since it's unlikely in the extreme to have any immediate practical consequences) how soon we find it, anyway? And who knows, it's just possible a research programme composed of the very, very best, whose work is of such obvious merit and creativity that it attracts freely-contributed funds, exploring areas chosen solely on their merit by those doing the work, and driven by curiosity instead of committee group-think, might just get there first. That's the way I'd bet. For a book addressed to a popular audience which contains not a single equation, many readers will find it quite difficult. If you don't follow these matters in some detail, you may find some of the more technical chapters rather bewildering. (The author, to be fair, acknowledges this at the outset.) For example, if you don't know what the hierarchy problem is, or why it is important, you probably won't be able to figure it out from the discussion here. On the other hand, policy-oriented readers will have little difficulty grasping the problems with the string theory programme and its probable causes even if they skip the gnarly physics and mathematics. 
An entertaining discussion of some of the problems of string theory, in particular the question of “background independence”, in which the string theorists universally assume the existence of a background spacetime which general relativity seems to indicate doesn't exist, may be found in Carlo Rovelli's "A Dialog on Quantum Gravity". For more technical details, see Lee Smolin's Three Roads to Quantum Gravity. There are some remarkable factoids in this book, one of the most stunning being that the proposed TeV class muon colliders of the future will produce neutrino (yes, neutrino) radiation which is dangerous to humans off-site. I didn't believe it either, but look here—imagine the sign: “DANGER: Neutrino Beam”! A U.S. edition is scheduled for publication at the end of September 2006. The author has operated the Not Even Wrong Web log since 2004; it is an excellent source for news and gossip on these issues. The unnamed “excitable … Harvard faculty member” mentioned on p. 227 and elsewhere is Luboš Motl (who is, however, named in the acknowledgements), and whose own Web log is always worth checking out.
Wednesday, June 7, 2006
2006-06-06: Switzerland Ready for Population Explosion, Antichrist
And that no man might buy or sell, save he that had the mark, or the name of the beast, or the number of his name.

The Swiss parliament chose the auspicious sixth day of the sixth month of the sixth year of the present millennium to vote, 124 in favour vs. 45 opposed, to expand the AVS numbers issued to all residents of Switzerland to 13 digits from the previous 11. (AVS numbers, like Social Security numbers in the U.S., are nominally for social insurance programs but are, in fact, used as a national identity number for various other purposes.) The reason for the change is that the previous numbering scheme was so well designed that it is possible for more than one person to be assigned the same number and, starting at the end of 2007, the numbers will wrap around and it will be impossible to distinguish mature people 100 years and older from youth of 99 and fewer. Still, to anybody with a smattering of knowledge of information theory or the entropy of numeric data, a thirteen digit personal ID number is a pretty breathtaking concept, especially when you're living in a country with a population of about seven and a half million people. Were the new thirteen digit numbers assigned with complete efficiency and no redundancy (which, of course, they aren't, not even remotely—in fact the number encodes an individual's birth date, sex, initial letters of their family name and nationality, and certainly must include some kind of check digit and, one hopes, provision for disambiguation in the case of collision: consider twins) they could accommodate a Swiss population of ten trillion people: more than a thousand times the entire present-day world population. While Switzerland is one of the few European countries whose population is not forecast to shrink in the coming decades, even ten billion seems pretty far off demographic-wise.
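The capacity arithmetic is easy to check for yourself, and a toy check digit shows how redundancy eats into the number space. Note the check-digit weighting below is purely an illustrative assumption (the common EAN-13-style scheme); the internals of the actual AVS number are not documented here, and the sample number is made up.

```python
# Back-of-the-envelope check: a 13-digit ID space holds 10^13 distinct
# numbers ("ten trillion"), versus roughly 7.5 million Swiss residents
# and about 6.5 billion people worldwide (mid-2006 estimate, an assumption).
capacity = 10 ** 13
swiss_population = 7_500_000
world_population = 6_500_000_000

print(capacity)                      # 10000000000000
print(capacity // world_population)  # 1538 -- "more than a thousand times"

def check_digit(body: str) -> int:
    """Hypothetical check digit for a 12-digit body: EAN-13-style
    alternating weights of 1 and 3, check = (10 - sum mod 10) mod 10."""
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(body))
    return (10 - total % 10) % 10

# Reserving one digit for the checksum cuts usable IDs by a factor of ten:
# 10^12 bodies, each with exactly one valid final digit.
print(check_digit("756123456789"))   # 7 (invented example body)
```

The point of the sketch is only that even after spending digits on a check digit and on encoded birth date, sex, name, and nationality fields, a 13-digit space is absurdly oversized for a country of seven and a half million.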