2006  

January 2006

Hayward, Steven F. Greatness. New York: Crown Forum, 2005. ISBN 0-307-23715-X.
This book, subtitled “Reagan, Churchill, and the Making of Extraordinary Leaders”, examines the parallels between the lives and careers of these two superficially very different men, in search of the roots of their greatness. Both were underestimated and disdained by their contemporaries (something historical distance has caused many to forget in the case of Churchill, a fact of which Hayward usefully reminds the reader) and considered too old for the challenges facing them when they arrived at the summit of power.

The beginning of the Cold War was effectively proclaimed by Churchill's 1946 “Iron Curtain” speech in Fulton, Missouri, and its end foretold by Reagan's “Tear Down this Wall” speech at the Berlin Wall in 1987. (Both speeches are worth reading in their entirety, as they have much more to say about the world of their times than the sound bites from them you usually hear.) Interestingly, both speeches were greeted with scorn, and much of Reagan's staff considered it fantasy to imagine, and an embarrassment to suggest, that the Berlin Wall might fall in the foreseeable future.

Only one chapter of the book is devoted to the Cold War; the bulk explores the experiences which formed the character of these men, their self-education in the art of statecraft, their remarkably similar evolution from youthful liberalism in domestic policy to stalwart confrontation of external threats, and their ability to talk over the heads of the political class directly to the population and instill their own optimism when so many saw only disaster and decline ahead for their societies. Unlike the vast majority of their contemporaries, neither Churchill nor Reagan considered Communism permanent—both believed it would eventually collapse due to its own, shall we say, internal contradictions. This short book provides excellent insight into how they came to that prophetic conclusion.

Bolchover, David. The Living Dead. Chichester, England: Capstone Publishing, 2005. ISBN 1-84112-656-X.
If you've ever worked in a large office, you may have occasionally found yourself musing, “Sure, I work hard enough, but what do all those other people do all day?” In this book, David Bolchover, whose personal work experience in two large U.K. insurance companies caused him to ask this question, investigates and comes to the conclusion, “Not very much”. Quoting statistics such as the fact that 70% of Internet pornography site accesses occur during the 9 to 5 work day, and that fully one third of mid-week visitors at a large U.K. theme park are employees who called in sick at work, the author discovers that it is remarkably easy to hold down a white collar job in many large organisations while doing essentially no productive work at all—simply showing up every day and collecting paychecks. While the Internet has greatly expanded the scope of goofing off on the job (type “bored at work” into Google and you'll get in excess of sixteen million hits), it supplements rather than replaces traditional alternatives to work, and is often easier to measure. The author estimates that as many as 20% of the employees in large offices contribute essentially nothing to their employer's business—these are the “living dead” of the title. Not only are the employers of these people getting nothing for their salaries; even more tragically, the living dead themselves are wasting their entire working careers and a huge portion of their lives in numbing boredom devoid of the satisfaction of doing something worthwhile.

In large office environments, there is often so little direct visibility of productivity that somebody who either cannot do the work or simply prefers not to can fall into the cracks for an extended period of time—perhaps until retirement. The present office work environment can be thought of as a holdover from the factory jobs of the industrial revolution, but while it is immediately apparent if a machine operator or production line worker does nothing, this may not be evident for office work. (One of the reasons outsourcing may work well for companies is that it forces them to quantify the value of the contracted work, and the outsourcing companies are motivated to better measure the productivity of their staff since they represent a profit centre, as opposed to a cost centre for the company which outsources.)

Back during my blessedly brief career in the management of an organisation which grew beyond the experience base of those who founded it, I found that the only way I could get a sense for what was actually going on in the company, as opposed to what one heard in meetings and read in memoranda, was what I called “Lieutenant Columbo” management—walking around with a little notepad, sitting down with people all over the company, and asking them to explain what they really did—not what their job title said or what their department was supposed to accomplish, but how they actually spent the working day, which was often quite different from what you might have guessed. Another enlightening experience for senior management is to spend a day jacked in to the company switchboard, listening (only) to a sample of the calls coming in from the outside world. I guarantee that anybody who does this for a full working day will end up with pages of notes about things they had no idea were going on. (The same goes for product developers, who should regularly eavesdrop on customer support calls.) But as organisations become huge, the distance between management and where the work is actually done becomes so great that expedients like this cannot bridge the gap: hence the legions of living dead.

The insights in this book extend to why so many business books (some seeming like they were generated by the PowerPoint Content Wizard) are awful and what that says about the CEOs who read them, why mumbo-jumbo like “going forward, we need to grow the buy-in for leveraging our core competencies” passes for wisdom in the business world (while somebody who said something like that at the dinner table would, and should, invite a hail of cutlery and vegetables), and why so many middle managers (the indispensable NCOs of the corporate army) are so hideously bad.

I fear the author may be too sanguine about the prospects of devolving the office into a world of home-working contractors, all entrepreneurial and self-motivated. I wish that world could come into being, and I sincerely hope it does, but one worries that the inner-directed people who prosper in such an environment are the ones who are already productive even in the stultifying environment of today's office. Perhaps a “middle way” such as Jack Stack's Great Game of Business (September 2004), combined with the devolving of corporate monoliths into clusters of smaller organisations as suggested in this book may point the way to dezombifying the workplace.

If you follow this list, you know how few “business books” I read—as this book so eloquently explains, most are hideous. This is one which will open your eyes and make you think.

Ronson, Jon. Them: Adventures with Extremists. New York: Simon & Schuster, 2002. ISBN 0-7432-3321-2.
Journalist and filmmaker Jon Ronson, intrigued by political and religious extremists in modern Western societies, decided to try to get inside their heads by hanging out with a variety of them as they went about their day to day lives on the fringe. Despite his being Jewish, a frequent contributor to the leftist Guardian newspaper, and often thought of as primarily a humorist, he found himself welcomed into the inner circle of characters as diverse as U.K. Muslim fundamentalist Omar Bakri; Randy Weaver and his daughter Rachel; Colonel Bo Gritz, whom he visits while helping to rebuild the Branch Davidian church at Waco; a Grand Wizard of the Ku Klux Klan attempting to remake the image of that organisation with the aid of self-help books; and Dr. Ian Paisley on a missionary visit to Cameroon (where he learns why it's a poor idea to order the “porcupine” in the restaurant when visiting that country).

Ronson is surprised to discover that, as incompatible as the doctrines of these characters may be, they are nearly unanimous in believing the world is secretly ruled by a conspiracy of globalist plutocrats who plot their schemes in shadowy venues such as the Bilderberg conferences and the Bohemian Grove in northern California. So, the author decides to check this out for himself. He stalks the secretive Bilderberg meeting to a luxury hotel in Portugal and discovers to his dismay that the Bilderberg Group stalks back, and that the British Embassy can't help you when they're on your tail. Then, he gatecrashes the bizarre owl god ritual in the Bohemian Grove through the clever expedient of walking in right through the main gate.

The narrative is entertaining throughout, and generally sympathetic to the extremists he encounters, who mostly come across as sincere (if deluded), and running small-time operations on a limited budget. After becoming embroiled in a controversy during a tour of Canada by David Icke, who claims the world is run by a cabal of twelve foot tall shape-shifting reptilians and who was accused of anti-Semitic hate speech on the grounds that these were “code words” for a Zionist conspiracy, the author ends up concluding that sometimes a twelve foot tall alien lizard is just an alien lizard.

Anderson, Brian C. South Park Conservatives. Washington: Regnery Publishing, 2005. ISBN 0-89526-019-0.
Who would have imagined that the advent of “new media”—not just the Internet, but also AM radio, freed of the shackles of the “fairness doctrine”; cable television, with its proliferation of channels and the advent of “narrowcasting”; and the venerable old media of stand-up comedy, cartoon series, and square old books—would end up being dominated by conservatives and libertarians? Certainly not the greybeards atop the media pyramid, who believed they set the agenda for public discourse and are now aghast to discover that the “people power” they always gave lip service to means just that—the people, not they, actually have the power, and there's nothing they can do to get it back into their own hands.

This book chronicles the conservative new media revolution of the past decade. There's nothing about the new media in themselves which has made it a conservative revolution—it's simply that it occurred in a society in which, at the outset, the media were dominated by an elite in thrall to a collectivist ideology which had little or no traction outside the imperial districts from which they declaimed. The audience they were haranguing held different beliefs entirely and, when they found media which spoke to them, immediately started to listen and tuned out the well-groomed, dulcet-voiced, insipid propagandists of the conventional wisdom.

One need only glance at the cratering audience figures for the old media—left-wing urban newspapers, television network news, and “mainstream” news-magazines—to see the extent to which they are being shunned. The audience abandoning them is discovering the new media: Web sites, blogs, cable news, and talk radio, which (if one follows a broad enough selection) give a sense of what is actually going on in the world, as opposed to what the editors of the New York Times and the Washington Post decide merits appearing on the front page.

Of course, the new media aren't perfect, but they are diverse—which is doubtless why collectivist partisans of coercive consensus so detest them. Some conservatives may be dismayed by the vulgarity of “South Park” (I'll confess; I'm a big fan), but we partisans of civilisation would be well advised to party down together under a broad and expansive tent. Otherwise, the bastards might kill Kenny with a rocket widget ball.

Dalrymple, Theodore. Our Culture, What's Left of It. Chicago: Ivan R. Dee, 2005. ISBN 1-56663-643-4.
Theodore Dalrymple is the nom de plume of Anthony Daniels, a British physician and psychiatrist who, until his recent retirement, practiced in a prison medical ward and public hospital in Birmingham, England. In his early career, he travelled widely, visiting such earthly paradises as North Korea, Afghanistan, Cuba, Zimbabwe (when it was still Rhodesia), and Tanzania, where he acquired an acute sense of the social prerequisites for the individual disempowerment which characterises the third world. This experience superbly equipped him to diagnose the same maladies in the city centres of contemporary Britain; he is arguably the most perceptive and certainly among the most eloquent contemporary observers of that society.

This book is a collection of his columns from City Journal, most dating from 2001 through 2004, about equally divided between “Arts and Letters” and “Society and Politics”. There are gems in both sections: you'll want to re-read Macbeth after reading Dalrymple on the nature of evil and the need for boundaries if humans are not to act inhumanly. Among the chapters of social commentary are a prophetic essay which almost precisely forecast the recent violence in France three years before it happened, one of the clearest statements of the inherent problems of Islam in adapting to modernity, and a persuasive argument against drug legalisation by somebody who spent almost his entire career treating the victims of both illegal drugs and the drug war. Dalrymple has decided to conclude his medical career in down-spiralling urban Britain and move to rural France where, notwithstanding problems, people still know how to live. Thankfully, he will continue his writing.

Many of these essays can be found on-line at the City Journal site; I've linked to those I cited in the last paragraph. I find that writing this fine is best enjoyed away from the computer, as ink on paper in a serene time, but it's great that one can now read material on-line to decide whether it's worth springing for the book.

Young, Michael. The Rise of the Meritocracy. New Brunswick, NJ: Transaction Publishers, [1958] 1994. ISBN 1-56000-704-4.
The word “meritocracy” has become so commonplace in discussions of modern competitive organisations and societies that you may be surprised to learn the word did not exist before 1958—a year after Sputnik—when the publication of this most curious book introduced the word and concept into the English language. This is one of the oddest works of serious social commentary ever written—so odd, in fact, that its author despaired of its ever seeing print after the manuscript was rejected by eleven publishers before finally appearing, whereupon it was quickly republished by Penguin and has been in print ever since, selling hundreds of thousands of copies and being translated into seven languages.

Even though the author was a quintessential “policy wonk” (he wrote the first postwar manifesto for the British Labour Party, founded the Open University and the Consumer Association, and sat in the House of Lords as Lord Young of Dartington), this is a work of…what shall we call it…utopia? dystopia? future history? alternative history? satire? ironic social commentary? science fiction?…beats me. It has also perplexed many others, including one of the publishers who rejected it on the grounds that “they never published Ph.D. theses”, apparently without having observed that the book is cast as a thesis written in the year 2034! Young's dry irony and understated humour have gone right past many readers, especially those unacquainted with English satire, moving them to outrage, as if George Orwell were thought to be advocating Big Brother. (I am well attuned to this phenomenon, having experienced it myself with the Unicard and Digital Imprimatur papers; no matter how obvious you make the irony, somebody, usually in what passes for universities these days, will take it seriously and explode in rage and vituperation.)

The meritocracy of this book is nothing like what politicians and business leaders mean when they parrot the word today (one hopes, anyway)! In the future envisioned here, psychology and the social sciences advance to the point that it becomes possible to determine the IQ of individuals at a young age, and this IQ, combined with the motivation and effort of the person, is an almost perfect predictor of their potential achievement in intellectual work. Given this, Britain is seen evolving from a class system based on heredity and inherited wealth to a caste system sorted by intelligence, with the high-intelligence élite “streamed” through special state schools with their peers, while the lesser endowed are directed toward manual labour, and those on the sorry side of the bell curve find employment as personal servants to the élite, freeing the élite's precious time for the life of the mind and the leisure and recreation it requires.

And yet the meritocracy is a thoroughly socialist society: the crème de la crème become the wise civil servants who direct the deployment of scarce human and financial capital to the needs of the nation in a highly-competitive global environment. Inheritance of wealth has been completely abolished, existing accumulations of wealth confiscated by “capital levies”, and all salaries made equal (although the élite, naturally, benefit from a wide variety of employer-provided perquisites—so is it always, even in merito-egalitopias). The benevolent state provides special schools for the intelligent progeny of working class parents, to rescue them from the intellectual damage their dull families might do, and prepare them for their shining destiny, while at the same time it provides sports, recreation, and entertainment to amuse the mentally modest masses when they finish their daily (yet satisfying, to dullards such as they) toil.

Young's meritocracy is a society where equality of opportunity has completely triumphed: test scores trump breeding, money, connections, seniority, ethnicity, accent, religion, and all of the other ways in which earlier societies sorted people into classes. The result, inevitably, is drastic inequality of results—but, hey, everybody gets paid the same, so it's cool, right? Well, for a while anyway…. As anybody who isn't afraid to look at the data knows perfectly well, there is a strong hereditary component to intelligence. Sorting people into social classes by intelligence will, over the generations, cause the mean intelligence of the largely non-interbreeding classes to drift apart (although there will be regression to the mean among outliers on each side, mobility among the classes due to individual variation will preserve or widen the gap). After a few generations this will result, despite perfect social mobility in theory, in a segregated caste system almost as rigid as that of England at the apogee of aristocracy. Just because “the masses” actually are benighted in this society doesn't mean they can't cause a lot of trouble, especially if incited by rabble-rousing bored women from the élite class. (I warned you this book will enrage those who don't see the irony.) Toward the end of the book, this conflict is building toward a crisis. Anybody who can guess the ending ought to be writing satirical future history themselves.

Actually, I wonder how many of those who missed the satire never finished the book or simply judged it by the title. It is difficult to read a passage like this one on p. 134 and mistake it for anything else.

Contrast the present — think how different was a meeting in the 2020s of the National Joint Council, which has been retained for form's sake. On the one side sit the I.Q.s of 140, on the other the I.Q.s of 99. On the one side the intellectual magnates of our day, on the other honest, horny-handed workmen more at home with dusters than documents. On the one side the solid confidence born of hard-won achievement; on the other the consciousness of a just inferiority.
Seriously, anybody who doesn't see the satire in this must be none too Swift. Although the book is cast as a retrospective from 2038, and there are passing references to atomic stations, home entertainment centres, school trips to the Moon, and the like, technologically the world seems very much like that of the 1950s. There is one truly frightening innovation, however. On p. 110, discussing the shrinking job market for shop attendants, we're told, “The large shop with its more economical use of staff had supplanted many smaller ones, the speedy spread of self-service in something like its modern form had reduced the number of assistants needed, and piped distribution of milk, tea, and beer was extending rapidly.” To anybody with personal experience of British plumbing and English beer, the mere thought of the latter being delivered through the former is enough to induce dystopic shivers of 1984 magnitude.

Looking backward from almost fifty years on, this book can be read as an alternative history of the last half-century. In the eyes of many with a libertarian or conservative inclination, just when the centuries-long battle against privilege and prejudice was finally being won, in the 1950s and early 60s when Young's book appeared, the dream of equal opportunity so eloquently embodied in Dr. Martin Luther King's “I Have a Dream” speech began to evaporate in favour of equality of results (by forced levelling and dumbing down if that's what it took), group identity and entitlements, and the creation of a permanently dependent underclass from which escape was virtually impossible. The best works of alternative history are those which change just one thing in the past and then let the ripples spread outward over the years. You can read this story as a possible future in which equal opportunity really did completely triumph over egalitarianism in the sixties. For those who assume that would have been an unqualifiedly good thing, here is a cautionary tale well worth some serious reflexion.

February 2006

Randall, Lisa. Warped Passages. New York: Ecco, 2005. ISBN 0-06-053108-8.
The author is one of the most prominent theoretical physicists working today, known primarily for her work on multi-dimensional “braneworld” models for particle physics and gravitation. With Raman Sundrum, she created the Randall-Sundrum models, the papers describing which are among the most highly cited in contemporary physics. In this book, aimed at a popular audience, she explores the revolution in theoretical physics which extra-dimensional models have sparked since 1999, finally uniting string theorists, model builders, and experimenters in the expectation of finding signatures of new physics when the Large Hadron Collider (LHC) comes on stream at CERN in 2007.

The excitement among physicists is palpable: there is now reason to believe that the unification of all the forces of physics, including gravity, may not lie forever out of reach at the Planck energy, but somewhere in the TeV range—which will be accessible at the LHC. This book attempts to communicate that excitement to the intelligent layman and, sadly, falls somewhat short of the mark. The problem, in a nutshell, is that while the author is a formidable physicist, she is not, at least at this point in her career, a particularly talented populariser of science. In this book she has undertaken an extremely ambitious task, since laying the groundwork for braneworld models requires recapitulating most of twentieth century physics, including special and general relativity, quantum mechanics, particle physics and the standard model, and the rudiments of string theory. All of this results in a 500 page volume where we don't really get to the new stuff until about page 300. Now, this problem is generic to physics popularisations, but many others have handled it much better; Randall seems compelled to invent an off-the-wall analogy for every single technical item she describes, even when the description itself would be crystal clear to a reader encountering the material for the first time. You almost start to cringe—after every paragraph or two about actual physics, you know there's one coming about water sprinklers, ducks on a pond, bureaucrats shuffling paper, artists mixing paint, drivers and speed traps, and a host of others. There are also far too few illustrations in the chapters describing relativity and quantum mechanics; Isaac Asimov used to consider it a matter of pride to explain things in words rather than using a diagram, but Randall is (as yet) neither the wordsmith nor the explainer that Asimov was, but then who is?

There is a lot to like here, and I know of no other popular source which so clearly explains what may be discovered when the LHC fires up next year. Readers familiar with modern physics might check this book out of the library or borrow a copy from a friend and start reading at chapter 15, or maybe chapter 12 if you aren't up on the hierarchy problem in the standard model. This is a book which could have greatly benefited from a co-author with experience in science popularisation: Randall's technical writing (for example, her chapter in the Wheeler 90th birthday festschrift) is a model of clarity and concision; perhaps with more experience she'll get a better handle on communicating to a general audience.

Warraq, Ibn [pseud.] ed. Leaving Islam. Amherst, NY: Prometheus Books, 2003. ISBN 1-59102-068-9.
Multiculturalists and ardent secularists may contend “all organised religions are the same”, but among all major world religions only Islam prescribes the death penalty for apostasy, which makes these accounts by former Muslims of the reasons for and experience of their abandoning Islam more than just stories of religious doubt. (There is some dispute as to whether the Koran requires death for apostates, or only threatens punishment in the afterlife. Some prominent Islamic authorities, however, interpret surat II:217 and IX:11,12 as requiring death for apostates. Numerous aḥadīth are unambiguous on the point, for example Bukhārī book 84, number 57 quotes Mohammed saying, “Whoever changed his Islamic religion, then kill him”, which doesn't leave a lot of room for interpretation, nor do authoritative manuals of Islamic law such as Reliance of the Traveller, which prescribes (o8.1) “When a person who has reached puberty and is sane voluntarily apostasizes from Islam, he deserves to be killed”. The first hundred pages of Leaving Islam explore the theory and practice of Islamic apostasy in both ancient and modern times.)

The balance of the book consists of personal accounts by apostates, both those born into Islam and converts who came to regret their embrace of what Salman Rushdie has called “that least huggable of faiths”. These testaments range from the tragic (chapter 15) to the philosophical (chapter 29) to the ironically humorous (chapter 37). One common thread which runs through the stories of many apostates is that while they were taught as children to “read” the Koran, what this actually meant was learning enough Arabic script and pronunciation to be able to recite the Arabic text, but without having any idea what it meant. (Very few of the contributors to this book speak Arabic as their mother tongue, and it is claimed [p. 400] that even native Arabic speakers can barely understand the classical Arabic of the Koran, but I don't know the extent to which this is true. In any case, only about 15% of Muslims are Arabic mother-tongue speakers.) In many of the narratives, disaffection with Islam either began, or was strongly reinforced, when they read the Koran in translation and discovered that the “real Islam” they had imagined as idealistic and benign was, on the evidence of what is regarded as the word of God, nothing of the sort. It is interesting that, unlike the Roman Catholic church before the Reformation, which attempted to prevent non-clergy from reading the Bible for themselves, Islam encourages believers to study the Koran and Ḥadīth, both in the original Arabic and in translation (see for example this official Saudi site). It is ironic that just such study of scripture seems to encourage apostasy, but perhaps this is the case only for those already so predisposed.

Eighty pages of appendices include quotations from the Koran and Ḥadīth illustrating the darker side of Islam and a bibliography of books and list of Web sites critical of Islam. The editor is author of Why I Am Not a Muslim (February 2002), editor of What the Koran Really Says (April 2003), and founder of the Institute for the Secularisation of Islamic Society.

Gurstelle, William. Adventures from the Technology Underground. New York: Clarkson Potter, 2006. ISBN 1-4000-5082-0.
This thoroughly delightful book invites the reader into a subculture of adults who devote their free time, disposable income, and considerable brainpower to defying Mr. Wizard's sage injunction, “Don't try this yourself at home”. The author begins with a handy litmus test to decide whether you're a candidate for the Technology Underground. If you think flying cars are a silly gag from The Jetsons, you don't make the cut. If, on the other hand, you not only think flying cars are perfectly reasonable but can barely comprehend why there isn't already one, ideally with orbital capability, in your own garage right now—it's the bleepin' twenty-first century, fervent snakes—then you “get it” and will have no difficulty understanding what motivates folks to build high powered rockets, giant Tesla coils, flamethrowers, hypersonic rail guns, hundred foot long pumpkin-firing cannons, and trebuchets (if you really want to make your car fly, it's just the ticket, but the operative word is “fly”, not “land”). In a world where basement tinkering and “that looks about right” amateur engineering has been largely supplanted by virtual and vicarious experiences mediated by computers, there remains the visceral attraction of heavy metal, high voltage, volatile chemicals, high velocities, and things that go bang, whoosh, zap, splat, and occasionally kaboom.

A technical section explains the theory and operation of the principal engine of entertainment in each chapter. The author does not shrink from using equations where useful to clarify design trade-offs; flying car fans aren't going to be intimidated by the occasional resonant transformer equation! The principles of operation of the various machines are illustrated by line drawings, but there isn't a single photo in the book, which is a real shame. Three story tall diesel-powered centrifugal pumpkin hurling machines, a four story 130 kW Tesla coil, and a calliope with a voice consisting of seventeen pulsejets are something one would like to see as well as read about, however artfully described.

Mullane, Mike. Riding Rockets. New York: Scribner, 2006. ISBN 0-7432-7682-5.
Mike Mullane joined NASA in 1978, one of the first group of astronauts recruited specifically for the space shuttle program. An Air Force veteran of 134 combat missions in Vietnam as back-seater in the RF-4C reconnaissance version of the Phantom fighter (imperfect eyesight disqualified him from pilot training), he joined NASA as a mission specialist and eventually flew on three shuttle missions: STS-41D in 1984, STS-27 in 1988, and STS-36 in 1990, the latter two classified Department of Defense missions for which he was twice awarded the National Intelligence Medal of Achievement. (Receipt of this medal was, at the time, itself a secret, but was declassified after the collapse of the Soviet Union. The work for which the medals were awarded remains secret to this day.)

As a mission specialist, Mullane never maneuvered the shuttle in space nor landed it on Earth, nor did he perform a spacewalk, mark any significant “first” in space exploration, or establish any records apart from being part of the crew of STS-36, which flew the highest inclination (62°) orbit of any human spaceflight so far. What he has done here is write one of the most enlightening, enthralling, and brutally honest astronaut memoirs ever published, far and away the best describing the shuttle era. All of the realities of NASA in the 1980s which were airbrushed out by Public Affairs Officers, with the complicity of an astronaut corps who knew that to speak to an outsider about what was really going on would mean they'd never get another flight assignment, are dealt with head-on: the dysfunctional, intimidation- and uncertainty-based management culture, the gap between what astronauts knew about the danger and unreliability of the shuttle and what NASA was telling Congress and the public, the conflict between battle-hardened military astronauts and perpetual-student post-docs recruited as scientist-astronauts, the shameless toadying to politicians, and the perennial over-promising of shuttle capabilities with its consequent corner-cutting and workforce exhaustion. (Those of a libertarian bent might wish they could warp back in time, shake the author by the shoulders, and remind him, “Hey dude, you're working for a government agency!”)

The realities of flying a space shuttle mission are described without any of the sugar-coating or veiled references common in other astronaut accounts, and always with a sense of humour. The deep-seated dread of strapping into an experimental vehicle with four million pounds of explosive fuel and no crew escape system is discussed candidly, along with the fact that, while universally shared by astronauts, it was, of course, never hinted at to outsiders, even passengers on the shuttle, who were told it was a kind of very fast, high-flying airliner. Even if the shuttle doesn't kill you, there's still the toilet to deal with, and any curiosity you've had about that particular apparatus will not outlast your finishing this book (the on-orbit gross-out prank on p. 179 may be too much even for “South Park”). Barfing in space, and the curious and little-discussed effects of microgravity on the male and female anatomy which may someday contribute mightily to the popularity of orbital tourism, are discussed in graphic detail. A glossary of NASA jargon and acronyms is included, but there is no index, which would have been a valuable addition.

Kelleher, Colm A. and George Knapp. Hunt for the Skinwalker. New York: Paraview Pocket Books, 2005. ISBN 1-4165-0521-0.
Memo to file: if you're one of those high-strung people prone to be rattled by the occasional bulletproof wolf, flying refrigerator, disappearing/reappearing interdimensional gateway, lumbering giant humanoid, dog-incinerating luminous orb, teleporting bull, and bloodlessly eviscerated cow, don't buy a ranch, even if it's a terrific bargain, whose very mention makes American Indians in the neighbourhood go “woo-woo” and slowly back away from you. That's what Terry Sherman (“Tom Gorman” in this book) and family did in 1994, walking into, if you believe their story, a seething nexus of the paranormal so weird and intense that Chris Carter could have saved a fortune by turning the “X-Files” into a reality show about their life. The Shermans found living with things which don't just go bump in the night but also slaughter their prize livestock and working dogs so disturbing that they jumped at the opportunity to unload the place in 1996, when the National Institute for Discovery Science (NIDS), a private foundation investigating the paranormal funded by real estate tycoon and inflatable space station entrepreneur Robert Bigelow, offered to buy them out in order to establish a systematic on-site investigation of the phenomena. (The NIDS Web site does not appear to have been updated since late 2004; I don't know if the organisation is still in existence or active.)

This book, co-authored by the biochemist who headed the field team investigating the phenomena and the television news reporter who covered the story, describes events on the ranch both before and during the scientific investigation. As is usual in such accounts, all the really weird stuff happened before the scientists arrived on the scene with their cameras, night vision scopes, radiation meters, spectrometers, and magnetometers (why is it always magnetometers, anyway?) and set up shop in their “command and control centre” (a.k.a. trailer—summoning to mind the VW bus “mobile command post” in The Lone Gunmen). Afterward, there was only the rare nocturnal light, mind-controlling black-on-black flying object, and transdimensional tunnel sighting (is an orange pulsating luminous orb which disgorges fierce four hundred pound monsters a “jackal lantern”?), none, of course, captured on film or video, nor registered on any other instrument.

These observations and investigations serve as the launch pad for eighty pages of speculation about causes, natural and supernatural, including the military, shape-shifting Navajo witches, extraterrestrials, invaders from other dimensions, hallucination-inducing shamanism, bigfoot, and a muddled epilogue which illustrates why biochemists and television newsmen should seek the advice of a physicist before writing about speculative concepts in modern physics. The conclusion is, unsurprisingly: “inconclusive.”

Suppose, for a moment, that all of this stuff really did happen, more or less as described. (Granted, that is a pretty big hypothetical, but then the family who first experienced the weirdness never seems to have sought publicity or profit from their experiences, and this book is the first commercial exploitation of the events, coming more than ten years after they began.) What could possibly be going on? Allow me to humbly suggest that the tongue-in-cheek hypothesis advanced in my 1997 paper Flying Saucers Explained, combined with some kind of recurring “branestorm” opening and closing interdimensional gates in the vicinity, might explain many of the otherwise enigmatic, seemingly unrelated, and nonsensical phenomena reported in this and other paranormal “hot spots”.

March 2006

Ferrigno, Robert. Prayers for the Assassin. New York: Scribner, 2006. ISBN 0-7432-7289-7.
The year is 2040. The former United States have fissioned into the coast-to-coast Islamic Republic in the north and the Bible Belt from Texas eastward to the Atlantic, with the anything-goes Nevada Free State acting as a broker between them, pressure relief valve, and window to the outside world. The collapse of the old decadent order was triggered by the nuclear destruction of New York and Washington, and the radioactive poisoning of Mecca by a dirty bomb in 2015, confessed to by an agent of the Mossad, who revealed a plot to set the Islamic world and the West against one another. In the aftermath, a wave of Islamic conversion swept the West, led by the glitterati and opinion leaders, with hold-outs fleeing to the Bible Belt, which co-exists with the Islamic Republic in a state of low intensity warfare. China has become the world's sole superpower, with Russia, reaping the benefit of refugees from overrun Israel, the high-technology centre.

This novel is set in the Islamic Republic, largely in the capital, Seattle (no surprise—even pre-transition, that's where the airheads seem to accrete, and whence bad ideas and flawed technologies seep out to despoil the heartland). The society sketched is believably rich and ambiguous: Muslims are divided into “modern”, “moderate”, and “fundamentalist” communities which more or less co-exist, like the secular, religious, and orthodox communities in present-day Israel. Many Catholics have remained in the Islamic Republic, reduced to dhimmitude and limited in their career aspirations, but largely left alone as long as they keep to themselves. The Southwest, with its largely Catholic hispanic population, is a zone of relative personal liberty within the Islamic Republic, much like Kish Island in Iran. Power in the Islamic Republic, as in Iran, is under constant contention among national security, religious police, the military, fanatic “fedayeen”, and civil authority, whose scheming against one another leaves cracks in which the clever can find a modicum of freedom.

But the historical events upon which the Islamic Republic is founded may not be what they seem, and the protagonists, the adopted but estranged son and daughter of the shadowy head of state security, must untangle decades of intrigue and misdirection to find the truth and make it public. There are some thoughtful and authentic touches in the world sketched in this novel: San Francisco has become a hotbed of extremist fundamentalism, which might seem odd until you reflect that moonbat collectivism and environmentalism share much of the same desire to make the individual submit to externally imposed virtue which suffuses radical Islam. Properly packaged and marketed, Islam can be highly attractive to disillusioned leftists, as the conversion of Carlos “the Jackal” from fanatic Marxist to “revolutionary Islam” demonstrates.

There are a few goofs. Authors who include nuclear weapons in their stories really ought to seek the advice of somebody who knows about them, or at least research them in the Nuclear Weapons FAQ. The “fissionable fuel rods from a new Tajik reactor…made from a rare isotope, supposedly much more powerful than plutonium” on p. 212, purportedly used to fabricate a five megaton bomb, is the purest nonsense in about every way imaginable. First of all, there are no isotopes, rare or otherwise, which are better than highly enriched uranium (HEU) or plutonium for fission weapons. Second, there's no way you could possibly make a five megaton fission bomb, regardless of the isotope you used—to get such a yield you'd need so much fission fuel that it would be much more than a critical mass and would predetonate, which would ruin your whole day. The highest yield fission bomb ever built was Ted Taylor's Mk 18F Super Oralloy Bomb (SOB), which contained about four critical masses of U-235 and depended upon the very low neutron background of HEU to permit implosion assembly before predetonation. The SOB had a yield of about 500 kt; with all the short half-life junk in fuel rods, there's no way you could possibly approach that yield, not to speak of something ten times as great. If you need high yield, tritium boosting or a full-fledged two-stage Teller-Ulam fusion design is the only way to go. The author also shares the common misconception in thrillers that radiation is something like an infectious disease which permanently contaminates everything it touches. Unfortunately, this fallacy plays a significant part in the story.
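
To put rough numbers on why the five megaton figure fails the laugh test, here is a back-of-the-envelope sketch in Python (the calculation is mine, using standard open-literature constants, not anything from the book or the novel):

    # Complete fission of 1 kg of U-235 releases roughly 17.6 kt of yield.
    KT_PER_KG = 17.6
    novel_yield_kt = 5_000                     # the novel's 5 Mt, expressed in kt
    mass_needed = novel_yield_kt / KT_PER_KG   # kg fissioned, at an impossible 100% efficiency
    BARE_CRITICAL_MASS_KG = 52.0               # unreflected U-235 sphere
    print(f"{mass_needed:.0f} kg of U-235 = "
          f"{mass_needed / BARE_CRITICAL_MASS_KG:.1f} bare critical masses")
    # Prints: 284 kg of U-235 = 5.5 bare critical masses.  Real weapons fission
    # only a fraction of their fuel, so the true requirement would be several
    # times larger still, and such an assembly would predetonate.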

Still, this is a well-crafted page-turner which, like the best alternative history, is not only entertaining but will make you think. The blogosphere has been chattering about this book (that's where I came across it), and they're justified in recommending it. The Web site for the book, complete with Flash animation and an annoying sound track, includes background information and the author's own blog with links to various reviews.

Susskind, Leonard. The Cosmic Landscape. New York: Little, Brown, 2006. ISBN 0-316-15579-9.
Leonard Susskind (and, independently, Yoichiro Nambu) co-discovered the original hadronic string theory in 1969. He has been a prominent contributor to a wide variety of topics in theoretical physics over his long career, and is a talented explainer of abstract theoretical concepts to the general reader. This book communicates both the physics and cosmology of the “string landscape” (a term he coined in 2003) revolution which has swiftly become the consensus among string theorists, as well as the intellectual excitement of those exploring this new frontier.

The book is subtitled “String Theory and the Illusion of Intelligent Design”, which may be better marketing copy—controversy sells—than descriptive of the contents. There is very little explicit discussion of intelligent design in the book at all except in the first and last pages, and what is meant by “intelligent design” is not, as the reader might expect, design arguments about the origin and evolution of life, but rather the apparent fine-tuning of the physical constants of our universe, the cosmological constant in particular, without which life as we know it (and, in many cases, not just life but even atoms, stars, and galaxies) could not exist. Susskind is eloquent in describing why the discovery that the cosmological constant, which virtually every theoretical physicist would have bet had to be precisely zero, is (apparently) a tiny positive number, seemingly fine-tuned to one hundred and twenty decimal places, “hit us like the proverbial ton of bricks” (p. 185)—here was a number which theory suggested should be 120 orders of magnitude greater, but which, had it been slightly larger than its minuscule value, would have precluded structure formation (and hence life) in the universe. One can imagine some as-yet-undiscovered mathematical explanation for a value which is precisely zero (and, indeed, physicists did: it's called supersymmetry, and searching for evidence of it is one of the reasons they're spending billions of taxpayer funds to build the Large Hadron Collider), but when you come across a dial set with the almost ridiculous precision of 120 decimal places, and it's a requirement for our own existence, thoughts of a benevolent Creator tend to creep into the mind of even the most doctrinaire scientific secularist. This is how the appearance of “intelligent design” (as the author defines it) threatens to get into the act, and the book is an exposition of the argument string theorists and cosmologists have developed to contend that such apparent design is entirely an illusion.

The very title of the book, then, invites us to contrast two theories of the origin of the universe: “intelligent design” and the “string landscape”. So, let's accept that challenge and plunge right in, shall we? First of all, permit me to observe that despite frequent claims to the contrary, including some in this book, intelligent design need not presuppose a supernatural being operating outside the laws of science and/or inaccessible to discovery through scientific investigation. The origin of life on Earth due to deliberate seeding with engineered organisms by intelligent extraterrestrials is a theory of intelligent design which has no supernatural component, evidence of which may be discovered by science in the future, and which is sufficiently plausible to have persuaded Francis Crick, co-discoverer of the structure of DNA, that it was the most likely explanation. If you observe a watch, you're entitled to infer the existence of a watchmaker, but there's no reason to believe he's a magician, just a craftsman.

If we're to compare these theories, let us begin by stating them both succinctly:

Theory 1: Intelligent Design.   An intelligent being created the universe and chose the initial conditions and physical laws so as to permit the existence of beings like ourselves.

Theory 2: String Landscape.   The laws of physics and initial conditions of the universe are chosen at random from among 10⁵⁰⁰ possibilities, only a vanishingly small fraction of which (probably no more than one in 10¹²⁰) can support life. The universe we observe, which is infinite in extent and may contain regions where the laws of physics differ, is one of an infinite number of causally disconnected “pocket universes” which spontaneously form from quantum fluctuations in the vacuum of parent universes, a process which has been occurring for an infinite time in the past and will continue in the future, time without end. Each of these pocket universes, which together make up the “megaverse”, has its own randomly selected laws of physics, and hence the overwhelming majority are sterile. We find ourselves in one of the tiny fraction of hospitable universes because if we weren't in such an exceptionally rare universe, we wouldn't exist to make the observation. Since there are an infinite number of universes, however, every possibility not only occurs, but occurs an infinite number of times, so not only are there an infinite number of inhabited universes, there are an infinite number identical to ours, including an infinity of identical copies of yourself wondering if this paragraph will ever end. Not only does the megaverse spawn an infinity of universes, each universe itself splits into two copies every time a quantum measurement occurs. Our own universe will eventually spawn a bubble which will destroy all life within it, probably not for a long, long time, but you never know. Evidence for all of the other universes is hidden behind a cosmic horizon and may remain forever inaccessible to observation.

Paging Friar Ockham! If unnecessarily multiplied hypotheses are the stubble that marks a fuzzy theory, it's pretty clear which of these is in need of the razor! Further, while one can imagine scientific investigation discovering evidence for Theory 1, almost all of the mechanisms which underlie Theory 2 remain, barring some conceptual breakthrough equivalent to looking inside a black hole, forever hidden from science by an impenetrable horizon through which no causal influence can propagate. So severe is this problem that chapter 9 of the book is devoted to the question of how far theoretical physics can go in the total absence of experimental evidence. What's more, unlike virtually every theory in the history of science, which attempted to describe the world we observe as accurately and uniquely as possible, Theory 2 predicts every conceivable universe and says, hey, since we do, after all, inhabit a conceivable universe, it's consistent with the theory. To one accustomed to the crystalline inevitability of Newtonian gravitation, general relativity, quantum electrodynamics, or the laws of thermodynamics, this seems by comparison like a California blonde saying “whatever”—the cosmology of despair.

Scientists will, of course, immediately rush to attack Theory 1, arguing that a being such as the one it posits would necessarily be “indistinguishable from magic”, capable of explaining anything, and hence unfalsifiable and beyond the purview of science. (Although note that on pp. 192–197 Susskind argues that Popperian falsifiability should not be a rigid requirement for a theory to be deemed scientific. See Lee Smolin's Scientific Alternatives to the Anthropic Principle for the argument against the string landscape theory on the grounds of falsifiability, and the 2004 Smolin/Susskind debate for a more detailed discussion of this question.) But let us look more deeply at the attributes of what might be called the First Cause of Theory 2. It not only permeates all of our universe, potentially spawning a bubble which may destroy it and replace it with something different; it pervades the abstract landscape of all possible universes, populating them with an infinity of independent and diverse universes over an eternity of time: omnipresent in spacetime. When a universe is created, all the parameters which govern its ultimate evolution (under the probabilistic laws of quantum mechanics, to be sure) are fixed at the moment of creation: omnipotent to create any possibility, perhaps even varying the mathematical structures underlying the laws of physics. As a budded-off universe evolves, whether a sterile formless void or teeming with intelligent life, no information is ever lost in its quantum evolution, not even down a black hole or across a cosmic horizon, and every quantum event splits the universe and preserves all possible outcomes. The ensemble of universes is thus omniscient of all its contents. Throw in intelligent and benevolent, and you've got the typical deity, and since you can't observe the parallel universes where the action takes place, you pretty much have to take it on faith. Where have we heard that before?

Lest I be accused of taking a cheap shot at string theory, or advocating a deistic view of the universe, consider the following creation story which, after John A. Wheeler, I shall call “Creation without the Creator”. Many extrapolations of continued exponential growth in computing power envision a technological singularity in which super-intelligent computers designing their own successors rapidly approach the ultimate physical limits on computation. Such computers would be sufficiently powerful to run highly faithful simulations of complex worlds, including intelligent beings living within them which need not be aware they were inhabiting a simulation, but thought they were living at the “top level”, who eventually passed through their own technological singularity, created their own simulated universes, populated them with intelligent beings who, in turn,…world without end. Of course, each level of simulation imposes a speed penalty (though, perhaps not much in the case of quantum computation), but it's not apparent to the inhabitants of the simulation since their own perceived time scale is in units of the “clock rate” of the simulation.

If an intelligent civilisation develops to the point where it can build these simulated universes, will it do so? Of course it will—just look at the fascination crude video game simulations have for people today. Now imagine a simulation as rich as reality and as unpredictable as tomorrow, actually creating an inhabited universe—who could resist? As unlimited computing power becomes commonplace, kids will create innovative universes and evolve them for billions of simulated years for science fair projects. Call the mean number of simulated universes created by intelligent civilisations in a given universe (whether top-level or itself simulated) the branching factor. If this is greater than one, and there is a single top-level non-simulated universe, then it will be outnumbered by simulated universes, which grow exponentially in number with the depth of the simulation. Hence, by the Copernican principle, or principle of mediocrity, we should expect to find ourselves in a simulated universe, since they vastly outnumber the single top-level one, which would be an exceptional place in the ensemble of real and simulated universes. Now here's the point: if, as we should expect from this argument, we do live in a simulated universe, then our universe is the product of intelligent design and Theory 1 is an absolutely correct description of its origin.
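
As a toy illustration of the head count behind this argument (my sketch in Python; the branching factor and depth are arbitrary assumed values, not figures from the book):

    # One real top-level universe; every universe, real or simulated,
    # spawns b simulated universes, nested to an assumed depth.
    b, depth = 2, 20
    universes_per_level = [b ** d for d in range(depth + 1)]
    total = sum(universes_per_level)                     # 2**21 - 1 = 2,097,151
    print(f"total universes: {total:,}")
    print(f"chance of being the top-level one: {1 / total:.1e}")   # ~4.8e-07
    # Even with this modest branching factor, a randomly chosen observer is
    # overwhelmingly likely to find itself inside a simulation.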

Suppose this is the case: we're inside a simulation designed by a freckle-faced superkid for extra credit in her fifth grade science class. Is this something we could discover, or must it, like so many aspects of Theory 2, be forever hidden from our scientific investigation? Surprisingly, this variety of Theory 1 is quite amenable to experiment: neither revelation nor faith is required. What would we expect to see if we inhabited a simulation? Well, there would probably be a discrete time step and granularity in position fixed by the time and position resolution of the simulation—check, and check: the Planck time and distance appear to behave this way in our universe. There would probably be an absolute speed limit to constrain the extent we could directly explore and impose a locality constraint on propagating updates throughout the simulation—check: speed of light. There would be a limit on the extent of the universe we could observe—check: the Hubble radius is an absolute horizon we cannot penetrate, and the last scattering surface of the cosmic background radiation limits electromagnetic observation to a still smaller radius. There would be a limit on the accuracy of physical measurements due to the finite precision of the computation in the simulation—check: Heisenberg uncertainty principle—and, as in games, randomness would be used as a fudge when precision limits were hit—check: quantum mechanics.

Might we expect surprises as we subject our simulated universe to ever more precise scrutiny, perhaps even astonishing the being which programmed it with our cunning and deviousness (as the author of any software package has experienced at the hands of real-world users)? Who knows, we might run into round-off errors which “hit us like a ton of bricks”! Suppose there were some quantity that was supposed to be exactly zero but, if you went and actually measured the geometry way out there near the edge and crunched the numbers, you found out it differed from zero in the 120th decimal place. Why, you might be as shocked as the naïve Perl programmer who ran the program “printf("%.18f", 0.2)” and was aghast when it printed “0.200000000000000011”, until somebody explained that with the 53 bits of mantissa in IEEE double precision floating point, you only get about 16 decimal digits (log₁₀ 2⁵³) of precision. So, what does a round-off in the 120th digit imply? Not Theory 2, with its infinite number of infinitely reproducing infinite universes, but simply that our Theory 1 intelligent designer used 400-bit numbers (log₂ 10¹²⁰ ≈ 399) in the simulation and didn't count on our noticing—remember you heard it here first, and if pointing this out causes the simulation to be turned off, sorry about that, folks! Surprises from future experiments which would be suggestive (though not probative) that we're in a simulated universe would include failure to find any experimental signature of quantum gravity (general relativity could be classical in the simulation, since potential conflicts with quantum mechanics would be hidden behind event horizons in the present-day universe, and extrapolating backward to the big bang would be meaningless if the simulation were started at a later stage, say at the time of big bang nucleosynthesis), and discovery of limits on the ability to superpose wave functions for quantum computation, which could result from limited precision in the simulation as opposed to the continuous complex values assumed by quantum mechanics. An interesting theoretical program would be to investigate feasible experiments which, by magnifying physical effects similar to proposed searches for quantum gravity signals, would detect round-off errors of magnitude comparable to the cosmological constant.
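
For anyone who cares to replay the arithmetic, here is a minimal Python equivalent of the Perl one-liner together with the digit-count estimates (the language is incidental; any system using IEEE 754 doubles behaves the same way):

    import math

    # 0.2 has no exact binary representation, so printing 18 decimal
    # places exposes the round-off, just as in the Perl example.
    print(f"{0.2:.18f}")          # 0.200000000000000011

    # A 53-bit IEEE 754 significand carries about 16 decimal digits.
    print(53 * math.log10(2))     # ~15.95

    # Representing a quantity to 120 significant decimal digits would
    # require roughly 400-bit numbers.
    print(120 / math.log10(2))    # ~398.6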

But seriously, this is an excellent book and anybody who's interested in the strange direction in which the string theorists are veering these days ought to read it; it's well-written, authoritative, reasonably fair to opposing viewpoints (although I'm surprised the author didn't address the background spacetime criticism of string theory raised so eloquently by Lee Smolin), and provides a roadmap of how string theory may develop in the coming years. The only nagging question you're left with after finishing the book is whether, after thirty years of theorising which comes to the conclusion that everything is predicted and nothing can be observed, string theory is about science any more.

 Permalink

Freeh, Louis J. with Howard Means. My FBI. New York: St. Martin's Press, 2005. ISBN 0-312-32189-9.
This may be one of the most sanctimonious and self-congratulatory books ever written by a major U.S. public figure who is not Jimmy Carter. Not only is the book titled “My FBI” (gee, I always thought it was supposed to belong to the U.S. taxpayers who pay the G-men's salaries and buy the ammunition they expend), in the preface, where the author explains why he reversed his original decision not to write a memoir of his time at the FBI, he uses the words “I”, “me”, “my”, and “myself” a total of 91 times in four pages.

Only about half of the book covers Freeh's 1993–2001 tenure as FBI director; the rest is a straightforward autohagiography of his years as an altar boy, Eagle Scout, idealistic but apolitical law student during the turbulent early 1970s, FBI agent, crusading anti-Mafia federal prosecutor in New York City, and hard-working U.S. district judge, before being appointed to the FBI job by Bill Clinton, who promised him independence and freedom from political interference in the work of the Bureau. Little did Freeh expect, when accepting the job, that he would spend much of his time in the coming years investigating the Clintons and their cronies. The tawdry and occasionally bizarre stories of those events as seen from the FBI fill a chapter and set the background for the tense relations between the White House and FBI on other matters such as terrorism and counter-intelligence. The Oklahoma City and Saudi Arabian Khobar Towers bombings, the Atlanta Olympics bomb, the identification and arrest of Unabomber Ted Kaczynski, and the discovery of long-term Soviet mole Robert Hanssen in the FBI all occurred on Freeh's watch; he provides a view of these events and the governmental turf battles they engendered from the perspective of the big office in the Hoover Building, but there's little or no new information about the events themselves. Freeh resigned the FBI directorship in June 2001, and September 11th of that year was his first day at his new job. (What do you do after nine years running the FBI? Go to work for a credit card company!) In a final chapter, he provides a largely exculpatory account of the FBI's involvement in counter-terrorism and what might have been done to prevent such terrorist strikes. He directly attacks Richard A. Clarke and his book Against All Enemies as a self-aggrandising account by a minor player, including some outright fabrications.

Freeh's book provides a peek into the mind of a self-consciously virtuous top cop—if only those foolish politicians and their paranoid constituents would sign over the last shreds of their liberties and privacy (on p. 304 he explicitly pitches for key escrow and back doors in encryption products, arguing “there's no need for this technology to be any more intrusive than a wiretap on a phone line”—indeed!), the righteous and incorruptible enforcers of the law and impartial arbiters of justice could make their lives ever so much safer and fret-free. And perhaps if the human beings in possession of those awesome powers were, in fact, as righteous as Mr. Freeh seems to believe himself to be, then there would be nothing to worry about. But evidence suggests cause for concern. On the next to last page of the book, p. 324, near the end of six pages of acknowledgements set in small type with narrow leading (didn't think we'd read that far, Mr. Freeh?), we find the author naming, as an exemplar of the “courageous and honorable men who serve us” who “deserve the nation's praise and lasting gratitude”, one Lon Horiuchi, the FBI sniper who shot and killed Vicki Weaver (who was accused of no crime) while she was holding her baby in her hands during the Ruby Ridge siege in August of 1992. Horiuchi later pled the Fifth Amendment in testimony before the U.S. Senate Judiciary Committee in 1995, ten years prior to Freeh's commendation of him here.

 Permalink

O'Brien, Flann [Brian O'Nolan]. The Dalkey Archive. Normal, IL: Dalkey Archive Press, [1964] 1993. ISBN 1-56478-172-0.
What a fine book to be reading on Saint Patrick's Day! Flann O'Brien (a nom de plume of Brian O'Nolan, who also wrote under the name Myles na gCopaleen, among others) is considered one of the greatest Irish authors of humour and satire in the twentieth century; James Joyce called him “A real writer, with the true comic spirit.” In addition to his novels, he wrote short stories, plays, and a multitude of newspaper columns in both the Irish and English languages. The Dalkey Archive is a story of mind-bending fantasy and linguistic acrobatics, yet so accessible it sucks the reader into its alternative reality almost unsuspecting. A substantial part of the material is recycled from The Third Policeman (January 2004) which, although completed in 1940, the author despaired of ever seeing published (it was eventually published posthumously in 1967). Both novels are works of surreal fantasy, but The Dalkey Archive is more conventionally structured and easier to get into, much as John Brunner's The Jagged Orbit stands in relation to his own earlier and more experimental Stand on Zanzibar.

The mad scientist De Selby, who appears offstage and in extensive and highly eccentric footnotes in The Third Policeman, is a key character here, joined by Saint Augustine and James Joyce. The master of malaprop, Sergeant Fottrell, and his curious “mollycule” theory about people and bicycles are here as well, providing a stolid counterpoint to De Selby's relativistic pneumatic theology and diabolical designs. It takes a special kind of genius to pack this much weirdness into only two hundred pages. If you're interested in O'Brien's curious career, this biography is an excellent starting point which contains no spoilers for any of his fiction.

 Permalink

Reynolds, Glenn. An Army of Davids. Nashville: Nelson Current, 2006. ISBN 1-59555-054-2.
In this book, law professor and über blogger (InstaPundit.com) Glenn Reynolds explores how present and near-future technology is empowering individuals at the comparative expense of large organisations in fields as diverse as retailing, music and motion picture production, national security, news gathering, opinion journalism, and, looking further out, nanotechnology and desktop manufacturing, human longevity and augmentation, and space exploration and development (including Project Orion [pp. 228–233]—now there's a garage start-up I'd love to work on!). Individual empowerment is, like the technology which creates it, morally neutral: good people can do more good, and bad people can wreak more havoc. Reynolds is relentlessly optimistic, and I believe justifiably so; good people outnumber bad people by a large majority, and in a society which encourages them to be “a pack, not a herd” (the title of chapter 5), they will have the means in their hands to act as a societal immune system against hyper-empowered malefactors far more effective than heavy-handed top-down repression and fear-motivated technological relinquishment.

Anybody who's seeking “the next big thing” couldn't find a better place to start than this book. Chapters 2, 3 and 7, taken together, provide a roadmap for the devolution of work from downtown office towers to individual entrepreneurs working at home and in whatever environments attract them, and the emergence of “horizontal knowledge”, supplanting the top-down one-to-many model of the legacy media. There are probably a dozen ideas for start-ups with the potential of eBay and Amazon lurking in these chapters if you read them with the right kind of eyes. If the business and social model of the twenty-first century indeed comes to resemble that of the eighteenth, all of those self-reliant independent people are going to need lots of products and services they will find indispensable just as soon as somebody manages to think of them. Discovering and meeting these needs will pay well.

The “every person an entrepreneur” world sketched here raises the same concerns I expressed in regard to David Bolchover's The Living Dead (January 2006): this will be a wonderful world, indeed, for the intelligent and self-motivated people who will prosper once liberated from corporate cubicle indenture. But not everybody is like that: in particular, those people tend to be found on the right side of the bell curve, and for every one on the right, there's one equally far to the left. We have already made entire categories of employment for individuals with average or below-average intelligence redundant. In the eighteenth century, there were many ways in which such people could lead productive and fulfilling lives; what will they do in the twenty-first? Further, ever since Bismarck, government schools have been manufacturing worker-bees with little initiative, and essentially no concept of personal autonomy. As I write this, the élite of French youth is rioting over a proposal to remove what amounts to a guarantee of lifetime employment in a first job. How will people so thoroughly indoctrinated in collectivism fare in an individualist renaissance? As a law professor, the author spends much of his professional life in the company of high-intelligence, strongly-motivated students, many of whom contemplate an entrepreneurial career and in any case expect to be judged on their merits in a fiercely competitive environment. One wonders if his optimism might be tempered were he to spend comparable time with denizens of, say, the school of education. But the fact that there will be problems in the future shouldn't make us fear it—heaven knows there are problems enough in the present, and the last century was kind of a colossal monument to disaster and tragedy; whatever the future holds, the prescription of more freedom, more information, greater wealth and health, and less coercion presented here is certain to make it a better place to live.

The individualist future envisioned here has much in common with that foreseen in the 1970s by Timothy Leary, who coined the acronym “SMIILE” for “Space Migration, Intelligence Increase, Life Extension”. The “II” is alluded to in chapter 12 as part of the merging of human and machine intelligence in the singularity, but mightn't it make sense, as Leary advocated, to supplement longevity research with investigation of the nature of human intelligence and near-term means to increase it? Realising the promise and avoiding the risks of the demanding technologies of the future are going to require both intelligence and wisdom; shifting the entire bell curve to the right, combined with the wisdom of longer lives, may be key in achieving the much to be desired future foreseen here.

InstaPundit visitors will be familiar with the writing style, which consists of relatively brief discussion of a multitude of topics, each with one or more references for those who wish to “read the whole thing” in more depth. One drawback of the print medium is that although many of these citations are Web pages, to get there you have to type in lengthy URLs for each one. An on-line edition of the end notes with all the on-line references as clickable links would be a great service to readers.

 Permalink

Buckley, Christopher. Florence of Arabia. New York: Random House, 2004. ISBN 0-8129-7226-0.
This is a very funny novel, and thought-provoking as well. Some speak of a “clash of civilisations” or “culture war” between the Western and Islamic worlds, but with few exceptions the battle has been waged inadvertently by the West, through the diffusion of its culture via mass media and globalised business, and indirectly by Islam, through immigration without assimilation into Western countries. Suppose the West were to say, “OK, you want a culture war? Here's a culture war!” and target one of fundamentalist Islam's greatest vulnerabilities: its subjugation and oppression of women?

In this story, the stuck-on-savage petroleum superpower Royal Kingdom of Wasabia cuts off one head too many when it executes a woman who had been befriended by Foreign Service staffer Florence Farfaletti, herself an escapee from trophy wife status in the desert kingdom, who hammers out a fifty-page proposal titled “Female Emancipation as a Means of Achieving Long-Term Political Stability in the Near East” and, undiplomatically vaulting over heaven knows how many levels of bureaucrats and pay grades, bungs it into the Secretary of State's in-box. Bold initiatives of this kind are not in keeping with what State does best, which is nothing, but Florence's plan comes to the attention of the mysterious “Uncle Sam”, who appears to have unlimited financial resources at his command and the Washington connections to make just about anything happen.

This sets things in motion, and soon Florence and her team, including a good ol' boy ex-CIA killer, a Foreign Service officer who detests travel, and a public relations wizard so amoral his slime almost qualifies him for OPEC membership, are set up in the Emirate of Matar, “Switzerland of the Gulf”, famed for its duty-free shopping, offshore pleasure domes at “Infidel Land”, and laid-back approach to Islam by clergy so well-compensated for their tolerance they're nicknamed “moolahs”. The mission? To launch TVMatar, a satellite network targeting Arab women, headed by the wife of the Emir, who was a British TV presenter before marrying the randy royal.

TVMatar's programming is, shall we say, highly innovative, and before long things are bubbling on both sides of the Wasabi/Matar border, with intrigue afoot on all sides, including Machiavellian misdirection by those masters of perfidy, the French. And, of course (p. 113), “This is the Middle East! … Don't you understand that since the start of time, startin' with the Garden of Eden, nothing has ever gone right here?” Indeed, before long, a great many things go all pear-shaped, with attendant action, suspense, laughs, and occasional tragedy. As befits a comic novel, in the end all is resolved, but many are the twists and turns to get there which will keep you turning pages, and there are delightful turns of phrase throughout, from CIA headquarters christened the “George Bush Center for Intelligence” in the prologue to Shem, the Camel Royal…but I mustn't spoil that for you.

This is a delightful read, laugh out loud funny, and enjoyable purely on that level. But in a world where mobs riot, burn embassies, and murder people over cartoons, while pusillanimous European politicians cower before barbarism and contemplate constraining liberties their ancestors bequeathed to humanity in the Enlightenment, one cannot help but muse, “OK, you want a culture war?”

 Permalink

Larson, Erik. The Devil in the White City. New York: Vintage Books, 2003. ISBN 0-375-72560-1.
It's conventional wisdom in the publishing business that you never want a book to “fall into the crack” between two categories: booksellers won't know where to shelve it, promotional campaigns have to convey a complicated mixed message, and you run the risk of irritating readers who bought it solely for one of the two topics. Here we have a book which evokes the best and the worst of the Gilded Age of the 1890s in Chicago by interleaving the contemporary stories of the 1893 World's Columbian Exposition and the depraved series of murders committed just a few miles from the fairgrounds by the archetypal American psychopathic serial killer, the chillingly diabolical Dr. H. H. Holmes (the principal alias among many used by a man whose given name was Herman Webster Mudgett; his doctorate was a legitimate medical degree from the University of Michigan). Architectural and industrial history and true crime are two genres you might think wouldn't mix, but in the hands of the author they result in a compelling narrative which I found as difficult to put down as any book I have read in the last several years. For once, this is not just my eccentric opinion; at this writing the book has been on The New York Times Best-Seller list for more than two consecutive years and won the Edgar award for best fact crime in 2004. As I rarely frequent best-seller lists, it went right under my radar. Special thanks to the visitor to this page who recommended I read it!

Boosters saw the Columbian Exposition not so much as a commemoration of the 400th anniversary of the arrival of Columbus in the New World but as a brash announcement of the arrival of the United States on the world stage as a major industrial, commercial, financial, and military power. They viewed the 1889 Exposition Universelle in Paris (for which the Eiffel Tower was built) as a throwing down of the gauntlet by the Old World, and vowed to assert the preeminence of the New by topping the French and “out-Eiffeling Eiffel”. Once decided on by Congress, the site of the exposition became a bitterly contested struggle among partisans of New York, Washington, and Chicago, with the latter seeing its victory as marking its own arrival as a peer of the Eastern cities which looked with disdain at what Chicagoans considered the most dynamic city in the nation.

Charged with building the Exposition, a city in itself, from scratch on barren, wind-swept, marshy land was architect Daniel H. Burnham, he who said, “Make no little plans; they have no magic to stir men's blood.” He made no little plans. The exposition was to have more than 200 buildings in a consistent neo-classical style, all in white, including the largest enclosed space ever constructed. While the electric light was still a novelty, the fair was to be illuminated by the first large-scale application of alternating current. Edison's kinetoscope amazed visitors with moving pictures, and a theatre presented live music played by an orchestra in New York and sent over telephone wires to Chicago. Nikola Tesla amazed fairgoers with huge bolts of electrical fire, and a giant wheel built by a man named George Washington Gale Ferris lifted more than two thousand people at once into the sky to look down upon the fair like gods. One of the army of workers who built the fair was a carpenter named Elias Disney, who later regaled his sons Roy and Walt with tales of the magic city; they must have listened attentively.

The construction of the fair in such a short time seemed miraculous to onlookers (and even more so to those accustomed to how long it takes to get anything built a century later), but the list of disasters, obstacles, obstructions, and outright sabotage which Burnham and his team had to overcome was so monumental you'd have almost thought I was involved in the project! (Although if you've ever set up a trade show booth in Chicago, you've probably gotten a taste of it.) A total of 27.5 million people visited the fair between May and October of 1893, and this in a country whose total population (1890 census) was just 62.6 million. Perhaps even more astonishing to those acquainted with comparable present-day undertakings, the exposition was profitable and retired all of its bank debt.

While the enchanted fair was rising on the shore of Lake Michigan and enthralling visitors from around the world, in a gloomy building the size of a city block not far away, Dr. H. H. Holmes was using his almost preternatural powers to charm the young, attractive, and unattached women who flocked to Chicago from the countryside in search of careers and excitement. He offered them the former in various capacities in the businesses, some legitimate and others bogus, in his “castle”, and the latter in his own person, until he killed them, disposed of their bodies, and in some cases sold their skeletons to medical schools. Were the entire macabre history of Holmes not thoroughly documented in court proceedings, investigators' reports, and reputable contemporary news items, he might seem to be a character from an over-the-top Gothic novel, like Jack the Ripper. But wait—Jack the Ripper was real too. However, Jack the Ripper is believed to have killed only five women; Holmes is known for certain to have killed nine men, women, and children. He confessed to killing 27 in all, but this was the third of three mutually inconsistent confessions, all at variance with documented facts (some of those he named in the third confession turned up alive). Estimates ran as high as two hundred, but that seems implausible. In any case, he was a monster the likes of which no American imagined inhabited their cities until his crimes were uncovered. Remarkably, and of interest to libertarians who advocate the replacement of state power by insurance-like private mechanisms, Holmes never even came under suspicion by any government law enforcement agency during the entire time he committed his murder spree, nor did any of his other scams (running out on debts, forging promissory notes, selling bogus remedies) attract the attention of the law. His undoing came when he attempted insurance fraud (one of his favourite activities) and ended up with Nemesis-like private detective Frank Geyer on his trail. Geyer, through tireless tracking and the expenditure of large quantities of shoe leather, got the goods on Holmes, who met his end on the gallows in May of 1896. His jailers considered him charming.

I picked this book up expecting an historical recounting of a rather distant and obscure era. Was I ever wrong—I finished the whole thing in two and a half days; the story is that fascinating and the writing that good. More than 25 pages of source citations and bibliography are included, but this is not a dry work of history; it reads like a novel. In places, the author has invented descriptions of events for which no eyewitness account exists; he says that in doing this, his goal is to create a plausible narrative as a prosecutor does at a trial. Most such passages are identified in the end notes and justifications given for the inferences made therein. The descriptions of the Exposition cry out for many more illustrations than are included: there isn't even a picture of the Ferris wheel! If you read this book, you'll probably want to order the Dover Photographic Record of the Fair—I did.

 Permalink

April 2006

Levitt, Steven D. and Stephen J. Dubner. Freakonomics. New York: William Morrow, 2005. ISBN 0-06-073132-X.
Finally—a book about one of my favourite pastimes: mining real-world data sets for interesting correlations and searching for evidence of causality—and it's gone and become a best-seller! Steven Levitt is a University of Chicago economics professor who excels in asking questions others never think to pose, such as, “If dealing crack is so profitable, why do most drug dealers live with their mothers?” and “Why do real estate agents leave their own houses on the market longer than the houses of their clients?”, then crunches the numbers to answer them, often with fascinating results. Co-author Stephen Dubner, who has written about Levitt's work for The New York Times Magazine, explains Levitt's methodologies in plain language that won't scare away readers inclined to be intimidated by terms such as “multiple regression analysis” and “confidence level”.

Topics run the gamut from the correlation between the legalisation of abortion and a drop in the crime rate to the effects of campaign spending on the outcome of elections, by way of cheating in sumo wrestling in Japan, tournament dynamics in advancement to managerial positions in the crack cocaine trade, Superman versus the Ku Klux Klan, and the generation-long trajectory of baby names from prestigious to down-market. In each case there are surprises in store, and sufficient background to understand where the results came from and the process by which they were obtained. The Internet has been a godsend for this kind of research: a wealth of public domain data in more or less machine-readable form awaits analysis by anybody curious about how it might fit together to explain something. This book is an excellent way to get your own mind asking such questions.
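
If you're tempted to try this sort of data mining yourself, the basic machinery is surprisingly modest. Here is an illustrative Perl sketch (my own, with made-up numbers, not data from the book) which computes the Pearson correlation coefficient on which many such analyses are built:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Pearson correlation coefficient of two equal-length samples.
    sub correlation {
        my ($x, $y) = @_;
        my $n = scalar @$x;
        my ($mx, $my) = (0, 0);
        $mx += $_ / $n for @$x;
        $my += $_ / $n for @$y;
        my ($sxy, $sxx, $syy) = (0, 0, 0);
        for my $i (0 .. $n - 1) {
            my ($dx, $dy) = ($x->[$i] - $mx, $y->[$i] - $my);
            $sxy += $dx * $dy;
            $sxx += $dx * $dx;
            $syy += $dy * $dy;
        }
        return $sxy / sqrt($sxx * $syy);
    }

    # Hypothetical data: does quantity Y track quantity X?
    my @x = (1, 2, 3, 4, 5, 6);
    my @y = (2.1, 3.9, 6.2, 8.1, 9.8, 12.2);
    printf("r = %.3f\n", correlation(\@x, \@y));   # close to 1: strong correlation

A correlation near ±1 only tells you the two quantities move together; establishing which, if either, causes the other is where the real detective work, and most of this book, begins.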

My only quibble with the book is the title: “Freakonomics: A Rogue Economist Explores the Hidden Side of Everything.” The only thing freaky about Levitt's work is that so few other professional economists are using the tools of their profession to ask and answer such interesting and important questions. And as to “rogue economist”, that's a rather odd term for somebody with degrees from Harvard and MIT, who is a full professor in one of the most prestigious departments of economics in the United States, recipient of the Clark Medal for best American economist under forty, and author of dozens of academic publications in the leading journals. But book titles, after all, are marketing tools, and the way this book is selling, I guess the title is doing its job quite well, thank you. A web site devoted to the book contains additional information, as well as New York Times columns by the authors with further analyses.

 Permalink

Verne, Jules. Voyage à reculons en Angleterre et en Écosse. Paris: Le Cherche Midi, 1989. ISBN 2-86274-147-7.
As a child, Jules Verne was fascinated by the stories of his ancestor who came to France from exotic Scotland to serve as an archer in the guard of Louis XI. Verne's attraction to Scotland was reinforced by his life-long love of the novels of Sir Walter Scott, and when in 1859, at age 31, he had a chance to visit that enchanting ancestral land, he jumped at the opportunity. This novel is a thinly fictionalised account of his “backwards voyage” to Scotland and England. “Backwards” («à reculons») because he and his travelling companion began their trip from Paris into the North by heading South to Bordeaux, where they had arranged economical passage on a ship bound for Liverpool, then on to Edinburgh, Glasgow, and then back by way of London and Dieppe—en sens inverse of most Parisian tourists. The theme of “backwards” surfaces regularly in the narrative, most amusingly on p. 110 where they find themselves advancing to the rear after having inadvertently wandered onto a nude beach.

So prolific was Jules Verne that more than a century and a half after he began his writing career, new manuscripts keep turning up among his voluminous papers. In the last two decades, Paris au XXe siècle, the original un-mangled version of La chasse au météore (October 2002), and the present volume have finally made their way into print. Verne transformed the account of his own trip into a fictionalised travel narrative of a kind quite common in the 19th century but rarely encountered today. The fictional form gave him freedom to add humour, accentuate detail, and highlight aspects of the country and culture he was visiting without crossing the line into that other venerable literary genre, the travel tall tale. One suspects that the pub brawl in chapter 16 is an example of such embroidery, along with the remarkable steam powered contraption on p. 159 which prefigured Mrs. Tweedy's infernal machine in Chicken Run. The description of the weather, however, seems entirely authentic. Verne offered the manuscript to Hetzel, who published most of his work, but it was rejected and remained forgotten until it was discovered in a cache of Verne papers acquired by the city of Nantes in 1981. This 1989 edition is its first appearance in print, and includes six pages of notes on the history of the work and its significance in Verne's œuvre, notes on changes in the manuscript made by Verne, and a facsimile manuscript page.

What is remarkable in reading this novel is the extent to which it is a fully-developed “template” for Verne's subsequent Voyages extraordinaires: here we have an excitable and naïve voyager (think Michel Ardan or Passepartout) paired with a more stolid and knowledgeable companion (Barbicane or Phileas Fogg), the encyclopedist's exultation in enumeration, fascination with all forms of locomotion, and fun with language and dialect (particularly poor Jacques, who beats the Dickens out of the language of Shakespeare). Often, when reading the early works of writers, you sense them “finding their voice”—not here. Verne is in full form, the master of his language and the art of story-telling, and fully ready, a few years later, with just a change of topic, to invent science fiction. This is not “major Verne”, and you certainly wouldn't want to start with this work, but if you've read most of Verne and are interested in how it all began, this is a genuine treat.

This book is out of print. If you can't locate a used copy at a reasonable price at the Amazon link above, try abebooks.com. For comparison with copies offered for sale, the cover price in 1989 was FRF 95, which is about €14.50 at the final fixed rate.

 Permalink

Wright, Evan. Generation Kill. New York: Berkley Caliber, 2004. ISBN 0-425-20040-X.
The author was an “embedded journalist” with Second Platoon, Bravo Company of the U.S. First Marine Reconnaissance Battalion from a week before the invasion of Iraq in March of 2003 through the entire active combat phase and subsequent garrison duty in Baghdad until the end of April. This book is an expanded edition of his National Magazine Award-winning reportage in Rolling Stone. Recon Marines are the elite component of the U.S. Marine Corps—like Army Special Forces or Navy SEALs; there are only about a thousand Recon Marines in the entire 180,000-strong Corps. In the invasion of Iraq, First Recon was used—some say misused—as the point of the spear, often the lead unit in their section of the conflict, essentially inviting ambushes by advancing into suspected hostile terrain.

Wright accompanied the troops 24/7 throughout their mission, sharing their limited rations, sleeping in the same “Ranger graves”, and risking enemy fire, incoming mortar rounds, and misdirected friendly artillery and airstrikes alongside the Marines. This is 100% grunt-level, boots-on-the-ground reportage in the tradition of Ernie Pyle and Bill Mauldin, and superbly done. If you're looking for grand strategy or “what it all means”, you won't find any of that: only the confusing and often appalling face of war as seen through the eyes of the young men sent to fight it. The impression you're left with of the troops (and recall, these are elite warriors of a military branch itself considered elite) is one of apolitical professionalism. You don't get the slightest sense they're motivated by patriotism or a belief they're defending their country or its principles; they're there to do their job, however messy and distasteful. One suspects you'd have heard much the same from the Roman legionnaires who occupied this land almost nineteen centuries ago.

The platoon's brief stay in post-conquest Baghdad provides some insight into why war-fighters, however they excel at breaking stuff and killing people, are as ill-suited to the tasks of nation building, restoring civil order, and promoting self-government as a chainsaw is to watchmaking. One begins to understand how it can be that three years after declaring victory in Iraq, a military power which was able to conquer the entire country in less than two weeks has yet to assert effective control over its capital city.

 Permalink

Bannier, Pierre. Pleins feux sur… Columbo. Paris: Horizon illimité, 2005. ISBN 2-84787-141-1.
It seems like the most implausible formula for a successful television series: no violence, no sex, no car chases, a one-eyed hero who is the antithesis of glamorous, detests guns, and drives a beat-up Peugeot 403. In almost every episode the viewer knows “whodunit” before the detective appears on the screen, and in most cases the story doesn't revolve around his discovery of the perpetrator, but rather obtaining evidence to prove their guilt, the latter done without derring-do or scientific wizardry, but rather endless, often seemingly aimless dialogue between the killer and the tenacious inspector. Yet “Columbo”, which rarely deviated from this formula, worked so well it ran (including pilot episodes) for thirty-five years in two separate series (1968–1978 and 1989–1994) and subsequent telefilm specials through 2003 (a complete episode guide is available online).

Columbo, as much a morality play about persistence and cunning triumphing over the wealthy, powerful, and famous as it is a mystery (creators of the series Richard Levinson and William Link said the character was inspired by Porfiry Petrovich in Dostoyevsky's Crime and Punishment and G. K. Chesterton's Father Brown mysteries), translates well into almost any language and culture. This book provides the French perspective on the phénomène Columbo. In addition to a comprehensive history of the character and series (did you know that the character which became Columbo first appeared in a story in Alfred Hitchcock's Mystery Magazine in 1960, or that Peter Falk was neither the first nor the second, but the third actor to portray Columbo?), details specific to l'Hexagone abound: a profile of Serge Sauvion, the actor who does the uncanny French doublage of Peter Falk's voice in the series, Marc Gallier, the “French Columbo”, and the stage adaptation in 2005 of Une femme de trop (based on the original stage play by Levinson and Link which became the pilot of the television series) starring Pascal Brunner. This being a French take on popular culture, there is even a chapter (pp. 74–77) providing a Marxish analysis of class conflict in Columbo! A complete episode guide with both original English and French titles and profiles of prominent guest villains rounds out the book.

For a hardcover, glossy paper, coffee table book, many of the colour pictures are hideously reproduced; they look like they were blown up from thumbnail images found on the Internet with pixel artefacts so prominent that in some cases you can barely make out what the picture is supposed to be. Other illustrations desperately need the hue, saturation, and contrast adjustment you'd expect to be routine pre-press steps for a publication of this type and price range. There are also a number of errors in transcribing English words in the text—sadly, this is not uncommon in French publications; even Jules Verne did it.

 Permalink

Kurlansky, Mark. 1968 : The Year That Rocked the World. New York: Random House, 2004. ISBN 0-345-45582-7.
In the hands of an author who can make an entire book about Salt (February 2005) fascinating, the epochal year of 1968 abounds with people, events, and cultural phenomena which make for a compelling narrative. Many watershed events in history (war, inventions, plague, geographical discoveries, natural disasters, economic booms and busts, etc.) have causes which are reasonably easy to determine. But 1968, like the wave of revolutions which swept Europe in 1848 (January 2002), seems to have been driven by a zeitgeist—a spirit in the air which independently inspired people to act in a common way.

The nearly simultaneous “youthquake” which shook societies as widespread and diverse as France, Poland, Mexico, Czechoslovakia, Spain, and the United States manifested itself in radical social movements (antiwar, feminism, black power, anti-authoritarianism, psychedelic instant enlightenment, revolutionary and subversive music, and the emergence of the “whole world is watching” wired planetary culture of live satellite television), all of which continue to reverberate today. It seemed so co-ordinated that politicians from Charles de Gaulle to Mexican el presidente Díaz Ordaz and Leonid Brezhnev were convinced it must be the result of deliberate subversion by their enemies, and were motivated to repressive actions which, in the short term, only fed the fire. In fact, most of the leaders of the various youth movements (to the extent they can be called “leaders”—in those individualistic and anarchistic days, most disdained the title) had never met, and knew about the actions of one another only from what they saw on television. Radicals in the U.S. were largely unaware of the student movement in Mexico before it exploded into televised violence in October.

However the leaders of 1968 may have viewed themselves, in retrospect they were for the most part fascinating, intelligent, well-educated, motivated by a desire to make the world a better place, and optimistic that they could—nothing like the dour, hateful, contemptuous, intolerant, and historically and culturally ignorant people one so often finds today in collectivist movements which believe themselves descended from those of 1968. Consider Mark Rudd's famous letter to Grayson Kirk, president of Columbia University, which ended with the memorable sentence, “I'll use the words of LeRoi Jones, whom I'm sure you don't like a whole lot: ‘Up against the wall, mother****er, this is a stick-up.’” (p. 197). That sentence shocked his contemporaries with the (quoted) profanity, but strikes readers today mostly for the grammatically correct use of “whom”. Who among present-day radicals has the eloquence of Mario Savio, who declared, “There's a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can't take part, you can't even tacitly take part, and you've got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you've got to make it stop” (p. 92), yet had the politeness to remove his shoes, to avoid damaging the paint, before jumping on a police car to address a crowd? In the days of the Free Speech Movement, who would have imagined some of those student radicals, tenured professors four decades later, enacting campus speech codes and enforcing an intellectual monoculture on their own students?

It is remarkable to read on p. 149 how the French soixante-huitards were “dazzled” by their German contemporaries: “We went there and they had their banners and signs and their security forces and everything with militaristic tactics. It was new to me and the other French.” One suspects they weren't paying attention when their parents spoke of the spring of 1940! Some things haven't changed: when New Left leaders from ten countries finally had the opportunity to meet one another at a conference sponsored by the London School of Economics and the BBC (p. 353), the Americans dismissed the Europeans as all talk and no action, while the Europeans mocked the U.S. radicals' propensity for charging into battle without thinking through why, what the goal was supposed to be, or how it was to be achieved.

In the introduction, the author declares his sympathy for the radical movements of 1968 and says “fairness is possible but true objectivity is not”. And, indeed, the book is written from the phrasebook of the leftist legacy media: good guys are “progressives” and “activists”, while bad guys are “right wingers”, “bigots”, or “reactionaries”. (What's “progressive” ought to depend on your idea of progress. Was SNCC's expulsion of all its white members [p. 96] on racial grounds progress?) I do not recall a single observation which would be considered outside the box on the editorial page of the New York Times. While the book provides a thorough recounting of the events and acquaintance with the principal personalities involved, for me it failed to evoke the “anything goes”, “everything is possible” spirit of those days—maybe you just had to have been there. The summation is useful for correcting false memories of 1968, which ended with both Dubček and de Gaulle still in power; the only major world leader defeated in 1968 was Lyndon Johnson, and he was succeeded by Nixon. A “whatever became of” or “where are they now” section would be a useful addition; such information, when it's given, is scattered all over the text.

One wonders whether, in our increasingly interconnected world, something like 1968 could happen again. Certainly, that's the dream of greying radicals nostalgic for their days of glory and young firebrands regretful for having been born too late. Perhaps better channels of communication and the collapse of monolithic political structures have resulted in change becoming an incremental process which adapts to the evolving public consensus before a mass movement has time to develop. It could simply be that the major battles of “liberation” have all been won, and the next major conflict will be incited by those who wish to rein them in. Or maybe it's just that we're still trying to digest the consequences of 1968 and far from ready for another round.

 Permalink

Smith, Edward E. Second Stage Lensmen. Baltimore: Old Earth Books, [1941–1942, 1953] 1998. ISBN 1-882968-13-1.
This is the fifth installment of the Lensman series, following Triplanetary (June 2004), First Lensman (February 2005), Galactic Patrol (March 2005), and Gray Lensman (August 2005). Second Stage Lensmen ran in serial form in Astounding Science Fiction from November 1941 through February 1942. This book is a facsimile of the illustrated 1953 Fantasy Press edition, which was revised from the original magazine serial.

The only thing I found disappointing when rereading this book in my fourth lifetime expedition through the Lensman saga is knowing there's only one volume of the main story remaining—but what a yarn that is. In Second Stage Lensmen, Doc Smith more overtly adopts the voice of “historian of civilisation” and from time to time departs from straight story-telling to describe off-stage action, discuss his “source material”, and grouse about Galactic Patrol secrecy depriving him of important documents. Still, there's enough rays and shields space opera action for three or four normal novels, although the focus increasingly shifts from super-weapons and shoot-em-ups to mental combat, indirection, and espionage.

It's here we first meet Nadreck, one of the most fascinating of Doc Smith's creations: a poison-breathing cryogenic being who extends into the fourth dimension and considers cowardice and sloth among his greatest virtues. His mind, however, like Kinnison's, honed to second stage Lensman capability by Mentor of Arisia, is both powerful and subtle, and Nadreck is a master of boring from within without the villains even suspecting his presence. He gets the job done, despite never being satisfied with his “pitifully imperfect” performance. I've known programmers like that.

Some mystery and thriller writers complain of how difficult the invention of mobile phones has made their craft. While it used to be easy for characters to be out of touch and operating with incomplete and conflicting information, now the reader immediately asks, “Why didn't she just pick up the phone and ask?” But in the Lensman universe, both the good guys and (to a lesser extent) the blackguards have instantaneous, mind-to-mind high bandwidth communication on an intergalactic scale, and such is Doc Smith's mastery of his craft that it neither reduces the suspense nor strains the plot, and he makes it look almost effortless.

Writing in an age where realistic women of any kind were rare in science fiction, Smith was known for his strong female characters—on p. 151 he observes, “Indeed, it has been argued that sexual equality is the most important criterion of that which we know as Civilization”—no postmodern multi-culti crapola here! Some critics carped that his women characters were so strong and resourceful they were just male heroes without the square jaws and broad shoulders. So here, probably in part just to show he can do it, we have Illona of Lonabar, a five-sigma airhead bimbo (albeit with black hair, not blonde), and the mind-murdering matriarchy of Lyrane, who have selectively bred their males to be sub-sentient dwarves with no function other than reproduction.

The author's inexhaustible imagination manages to keep these stories up to date, even more than half a century on. While the earlier volumes stressed what would decades later be called low-observable or stealth technology, in this outing he anticipates today's hot Pentagon buzzword, “network-centric warfare”: the grand battles here are won not by better weapons or numbers, but by the unique and top secret information technology of the Z9M9Z Directrix command vessel. The bizarre excursion into “Nth-space” may have seemed over the top to readers in the 1940s, but today it's reminiscent of another valley in the cosmic landscape of string theory.

Although there is a fifteen page foreword by the author which recaps the story to date, you don't really want to start with this volume: there's just too much background and context you'll have missed. It's best either to start at the beginning with Triplanetary or, if you'd rather defer the two slower-paced “prequels”, with Volume 3, Galactic Patrol, which was the first written and can stand alone.

 Permalink

May 2006

Bonner, William and Addison Wiggin. Empire of Debt. Hoboken, NJ: John Wiley & Sons, 2006. ISBN 0-471-73902-2.
To make any sense in the long term, an investment strategy needs to be informed by a “macro macro” view of the global economic landscape and the grand-scale trends which shape it, as well as a fine sense for nonsense: the bubbles, manias, and unsustainable situations which seduce otherwise sane investors into doing crazy things which will inevitably end badly, although nobody can ever be sure precisely when. This is the perspective the authors provide in this wise, entertaining, and often laugh-out-loud funny book. If you're looking for tips on what stocks or funds to buy or sell, look elsewhere; the focus here is on the emergence in the twentieth century of the United States as a global economic and military hegemon, and the bizarre economic foundations of this most curious empire. The analysis of the current scene is grounded in a historical survey of empires and a recounting of how the United States became one.

The business of empire has been conducted more or less the same way all around the globe over millennia. An imperial power provides a more or less peaceful zone to vassal states, a large, reasonably open market in which they can buy and sell their goods, safe transport for goods and people within the imperial limes, and a common currency, system of weights and measures, and other lubricants of efficient commerce. In return, vassal states finance the empire through tribute: either explicit, or indirectly through taxes, tariffs, troop levies, and other imperial exactions. Now, history is littered with the wreckage of empires (more than fifty are listed on p. 49), which have failed in the time-proven ways, but this kind of traditional empire at least has the advantage that it is profitable—the imperial power is compensated for its services (whether welcome or appreciated by the subjects or not) by the tribute it collects from them, which may be invested in further expanding the empire.

The American empire, however, is unique in all of human history for being funded not by tribute but by debt. The emergence of the U.S. dollar as the global reserve currency, severed from the gold standard or any other measure of actual value, has permitted the U.S. to build a global military presence and domestic consumer society by borrowing the funds from other countries (notably, at the present time, China and Japan), who benefit (at least in the commercial sense) from the empire. Unlike tribute, the debt remains on the balance sheet as an exponentially growing liability which must eventually either be repaid or repudiated. In this environment, international trade has become a system in which (p. 221) “One nation buys things that it cannot afford and doesn't need with money it doesn't have. Another sells on credit to people who already cannot pay and then builds more factories to increase output.” Nobody knows how long the game can go on, but when it ends, it is certain to end badly.
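
To see what “exponentially growing liability” means in practice, consider the compounding arithmetic. The following Perl sketch uses hypothetical round numbers of my own choosing, not figures from the book:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # A debt rolled over at a constant interest rate grows exponentially.
    my $debt = 8.0e12;    # hypothetical principal, dollars
    my $rate = 0.05;      # hypothetical 5% annual interest
    foreach my $year (0, 10, 20, 30) {
        printf("Year %2d: \$%5.1f trillion\n",
               $year, $debt * (1 + $rate) ** $year / 1e12);
    }

At a steady five percent, the liability roughly quadruples in thirty years; tribute, by contrast, is collected and spent, leaving nothing compounding on the books.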

An empire which has largely ceased to produce stuff for its citizens, whose principal export has become paper money (to the tune of about two billion dollars per day at this writing), will inevitably succumb to speculative binges. No sooner had the dot.com mania of the late 1990s collapsed than the residential real estate bubble began to inflate, with houses bought with interest-only mortgages considered “investments” which are “flipped” in a matter of months, and equity extracted by further assumption of debt used to fund current consumption. This contemporary collective delusion is well documented, with perspectives on how it may end.

The entire book is written in an “always on” ironic style, with a fine sense for the absurdities which are taken for wisdom and the charlatans and nincompoops who peddle them to the general public in the legacy media. Some may consider the authors' approach as insufficiently serious for a discussion of an oncoming global financial train wreck but, as they note on p. 76, “There is nothing quite so amusing as watching another man make a fool of himself. That is what makes history so entertaining.” Once you get your head out of the 24 hour news cycle and the political blogs and take the long view, the economic and geopolitical folly chronicled here is intensely entertaining, and the understanding of it imparted in this book is valuable in developing a strategy to avoid its inevitable tragic consequences.

 Permalink

Stephenson, Neal. Cryptonomicon. New York: Perennial, 1999. ISBN 0-380-78862-4.
I've found that I rarely enjoy, and consequently am disinclined to pick up, these huge, fat, square works of fiction cranked out by contemporary super scribblers such as Tom Clancy, Stephen King, and J.K. Rowling. In each case, the author started out and made their name crafting intricately constructed, tightly plotted page-turners, but later on succumbed to a kind of mid-career spread which yields flabby doorstop novels that give you hand cramps if you read them in bed and contain more filler than thriller. My hypothesis is that when a talented author is getting started, their initial books receive the close attention of a professional editor and benefit from the discipline imposed by an individual whose job is to flense the flab from a manuscript. But when an author becomes highly successful—a “property” who can be relied upon to crank out best-seller after best-seller—it becomes harder for an editor to restrain an author's proclivity to bloat and bloviation. (This is not to say that all authors are so prone, but some certainly are.) I mean, how would you feel giving Tom Clancy advice on the art of crafting thrillers, even though Executive Orders could easily have been cut by a third and would probably have been a better novel at half the size?

This is why, despite my having tremendously enjoyed his earlier Snow Crash and The Diamond Age, Neal Stephenson's Cryptonomicon sat on my shelf for almost four years before I decided to take it with me on a trip and give it a try. Hey, even later Tom Clancy can be enjoyed as “airplane” books as long as they fit in your carry-on bag! While ageing on the shelf, this book was one of the most frequently recommended by visitors to this page, and friends to whom I mentioned my hesitation to dive into the book unanimously said, “You really ought to read it.” Well, I've finished it, so now I'm in a position to tell you, “You really ought to read it.” This is simply one of the best modern novels I have read in years.

The book is thick, but that's because the story is deep and sprawling and requires a large canvas. Stretching over six decades and three generations, and melding genres as disparate as military history, cryptography, mathematics and computing, business and economics, international finance, privacy and individualism versus the snooper state and intrusive taxation, personal eccentricity and humour, telecommunications policy and technology, civil and military engineering, computers and programming, the hacker and cypherpunk culture, and personal empowerment as a way of avoiding repetition of the tragedies of the twentieth century, the story defies classification into any neat category. It is not science fiction, because all of the technologies exist (or plausibly could have existed—well, maybe not the Galvanick Lucipher [p. 234; all page citations are to the trade paperback edition linked above. I'd usually cite by chapter, but they aren't numbered and there is no table of contents]—in the epoch in which they appear). Some call it a “techno thriller”, but it isn't really a compelling page-turner in that sense; this is a book you want to savour over a period of time, watching the story lines evolve and weave together over the decades, and thinking about the ideas which underlie the plot line.

The breadth of the topics which figure in this story requires encyclopedic knowledge, which the author demonstrates while making it look effortless, never like he's showing off. Stephenson writes with the kind of universal expertise for which Isaac Asimov was famed, but he's a better writer than the Good Doctor, and that's saying something. Every few pages you come across a gem such as the following (p. 207), which is the funniest paragraph I've read in many a year.

He was born Graf Heinrich Karl Wilhelm Otto Friedrich von Übersetzenseehafenstadt, but changed his name to Nigel St. John Gloamthorpby, a.k.a. Lord Woadmire, in 1914. In his photograph, he looks every inch a von Übersetzenseehafenstadt, and he is free of the cranial geometry problem so evident in the older portraits. Lord Woadmire is not related to the original ducal line of Qwghlm, the Moore family (Anglicized from the Qwghlmian clan name Mnyhrrgh) which had been terminated in 1888 by a spectacularly improbable combination of schistosomiasis, suicide, long-festering Crimean war wounds, ball lightning, flawed cannon, falls from horses, improperly canned oysters, and rogue waves.
On p. 352 we find one of the most lucid and concise explanations I've ever read of why it is far more difficult to escape the grasp of now-obsolete technologies than most technologists may wish.
(This is simply because the old technology is universally understood by those who need to understand it, and it works well, and all kinds of electronic and software technology has been built and tested to work within that framework, and why mess with success, especially when your profit margins are so small that they can only be detected by using techniques from quantum mechanics, and any glitches vis-à-vis compatibility with old stuff will send your company straight into the toilet.)
In two sentences on p. 564, he lays out the essentials of the original concept for Autodesk, which I failed to convey (providentially, in retrospect) to almost every venture capitalist in Silicon Valley in thousands more words and endless, tedious meetings.
“ … But whenever a business plan first makes contact with the actual market—the real world—suddenly all kinds of stuff becomes clear. You may have envisioned half a dozen potential markets for your product, but as soon as you open your doors, one just explodes from the pack and becomes so instantly important that good business sense dictates that you abandon the others and concentrate all your efforts.”
And how many New York Times Best-Sellers contain working source code (p. 480) for a Perl program?

A 1168-page mass-market paperback edition is now available, but given the unwieldiness of such an edition, how much you're likely to thumb through it to refresh your memory on little details as you read it, the likelihood you'll end up reading it more than once, and the relatively small difference in price, the trade paperback cited at the top may be the better buy. Readers interested in the cryptographic technology and culture which figure in the book will find additional information in the author's Cryptonomicon cypher-FAQ.

 Permalink

Ravitch, Diane. The Language Police. New York: Alfred A. Knopf, 2003. ISBN 0-375-41482-7.
One thing which strikes me, having been outside the United States for fifteen years, is just how dumb people in the U.S. are, particularly those 35 years and younger. By “dumb” I don't mean unintelligent: although there is a genetic component to intelligence, evolution doesn't work quickly enough to make much difference in a generation or two, and there's no evidence for selective breeding for stupidity in any case. No, they are dumb in the sense of being almost entirely ignorant of the literary and cultural heritage upon which their society is founded, and know next to nothing about the history of their own country and the world. Further, and even more disturbing, they don't seem to know how to think. Rational thinking is a skill one learns by practice, and these people never seem to have worked through the intellectual exercises to acquire it, and hence have never discovered the quiet joy of solving problems and figuring things out. (Of course, I am talking in broad generalisations here. In a country as large and diverse as the U.S. there are many, many exceptions, to be sure. But the overall impression of the younger population, exceptions apart, comes across to me as dumb.)

You may choose to attribute this estimation to the jaundiced disdain for young'uns so common among balding geezers like me. But the funny thing is, I observe this only in people who grew up in the U.S. I don't perceive anything similar in those raised in continental Europe or Asia. (I'm not so sure about the U.K., and my experience with people from South America and Africa is insufficient to form any conclusions.) Further, this seems to be a relatively new phenomenon; I don't recall perceiving anything like the present level of dumbness among contemporaries when I was in the 20–35 age bracket. If you doubt my estimation of the knowledge and reasoning skills of younger people in the U.S., just cast a glance at the highest moderated comments on one of the online discussion boards such as Slashdot, and bear in mind when doing so that these are the technological élite, not the fat middle of the bell curve. Here is an independent view of younger people in the U.S. which comes to much the same conclusion as I.

What could possibly account for this? Well, it may not be the entire answer, but an important clue is provided by this stunning book by an historian and professor of education at New York University, which documents the exclusion of essentially the entire body of Western culture from the primary and secondary school curriculum starting in around 1970, and the rewriting of history to exclude anything perceived as controversial by any pressure group motivated to involve itself in the textbook and curriculum adoption process, which is described in detail. Apart from a few egregious cases which have come to the attention of the media, this process has happened almost entirely out of the public eye, and an entire generation has now been educated, if you can call it that, with content-free material chosen to meet bizarre criteria of “diversity” and avoid offending anybody. How bad is it? So bad that the president of a textbook company, when asked in 1998 by members of the committee charged with developing a national reading test proposed by President Clinton, why the reading passages chosen contained nothing drawn from classic literature or myth, replied, as if it were the most obvious thing in the world, “everything written before 1970 was either gender biased or racially biased.” So long, Shakespeare; heave-ho Homer! It's no wonder the author of I'm the Teacher, You're the Student (January 2005) discovered so many of his students at a top-tier university had scarcely read a single book before arriving in his classroom: their public school experience had taught them that reading is tedious and books contain only boring, homogenised pablum utterly disconnected from the real world they experience through popular culture and their everyday life.

The author brings no perceptible political bias or agenda to the topic. Indeed, she documents how the ideologues of the right and left form a highly effective pincer movement which squeezes out the content and intellectual stimulation from the material taught in schools, and thus educates those who pass through them that learning is boring, reading is dull, and history is all settled, devoid of controversy, and that every event in the past should be interpreted according to the fashionable beliefs of the present day. The exquisite irony is that this is said to be done in the interest of “diversity” when, in fact, the inevitable consequence is the bowdlerisation of the common intellectual heritage into mediocre, boring, and indistinguishable pap. It is also interesting to observe that the fundamental principles upon which the champions of this “diversity” base their arguments—that one's ethnic group identity determines how an individual thinks and learns; that one cannot and should not try to transcend that group identity; that a member of a group can learn only from material featuring members of their own group, ideally written by a group member—are, in fact, identical to those believed by the most vicious of racists. Both reject individualism and the belief that any person, if blessed with the requisite talent and fired by ambition and the willingness to work assiduously toward the goal, can achieve anything at all in a free society.

Instead, we see things like this document, promulgated by the public school system of Seattle, Washington (whose motto is “Academic Achievement for Every Student in Every School”), which provides “Definitions of Racism” in six different categories. (Interesting—the Seattle Public Schools seem to have taken this document down—wonder why? However, you can still view a copy I cached just in case that might happen.) Under “Cultural Racism” we learn that “having a future time orientation, emphasizing individualism as opposed to a more collective ideology, [and] defining one form of English as standard” constitutes “cultural racism”. Some formula for “Academic Achievement for Every Student”, don't you think? (Reading The Language Police is quite enlightening in parsing details such as those in the drawing which appears to the right of the first paragraph of this document. It shows a group of people running a foot race [exercise: good]. Of the four people whose heads are shown, one is a Caucasian female [check], another is an African American male [check], a third is an Hispanic man [check—although the bias and sensitivity guidelines of two major textbook companies (p. 191) would fault this picture because, stereotypically, the man has a moustache], and an older [check] Caucasian male [older people must always be shown as active; never sitting on the porch in a rocking chair]. Two additional figures are shown with their heads lopped off: one an African American woman and the other what appears to be a light-skinned male. Where's the Asian?) Now, this may seem ridiculous, but every major U.S. textbook publisher these days compiles rigorous statistics on the racial and gender mix of both text and illustrations in their books, and adjusts them to precisely conform to percentages from the U.S. census. Intellectual content appears to receive no such scrutiny.

A thirty page appendix provides a list of words, phrases, and concepts banned from U.S. textbooks, including the delightful list (p. 196) of Foods which May Not Be Mentioned in California, including pickles and tea. A second appendix of the same length provides a wonderful list of recommendations of classic literature for study from grades three through ten. Home schoolers will find this a bounty of worthwhile literature to enrich their kids' education and inculcate the love of reading, and it's not a bad place to start for adults who have been deprived of this common literary heritage in their own schooling. A paperback edition is now available.

 Permalink

June 2006

Woit, Peter. Not Even Wrong. London: Jonathan Cape, 2006. ISBN 0-224-07605-1.
Richard Feynman, a man about as difficult to bamboozle on scientific topics as any who ever lived, remarked in an interview (p. 180) in 1987, a year before his death:
…I think all this superstring stuff is crazy and it is in the wrong direction. … I don't like that they're not calculating anything. I don't like that they don't check their ideas. I don't like that for anything that disagrees with an experiment, they cook up an explanation—a fix-up to say “Well, it still might be true.”
Feynman was careful to hedge his remark as being that of an elder statesman of science, a group with a long history of foolishly dismissing the speculations of younger researchers as nonsense, and he would almost certainly have opposed any effort to cut off funding for superstring research: it might be right, after all, and should be pursued in parallel with other promising avenues until experiment can test their predictions, falsifying and excluding those candidate theories whose predictions prove incorrect.

One wonders, however, what Feynman's reaction would have been had he lived to contemplate the contemporary scene in high energy theoretical physics almost twenty years later. String theory and its progeny still have yet to make a single, falsifiable prediction which can be tested by a physically plausible experiment. This isn't surprising, because after decades of work and tens of thousands of scientific publications, nobody really knows, precisely, what superstring (or M, or whatever) theory really is; there is no equation, or set of equations, from which one can draw physical predictions. Leonard Susskind, a co-founder of string theory, observes ironically in his book The Cosmic Landscape (March 2006), “On this score, one might facetiously say that String Theory is the ultimate epitome of elegance. With all the years that String Theory has been studied, no one has ever found a single defining equation! The number at present count is zero. We know neither what the fundamental equations of the theory are or even if it has any.” (p. 204). String theory might best be described as the belief that a physically correct theory exists and may eventually be discovered by the research programme conducted under that name.

From the time Feynman spoke through the 1990s, the goal toward which string theorists were working was well-defined: to find a fundamental theory which reproduces, in the low energy limit, the successful results of the standard model of particle physics, and explains, from first principles, the values of the many free parameters of that theory (there are various, slightly different ways to count them; the author gives the number as 18 in this work), whose values are not predicted by any theory and must be filled in by experiment. Disturbingly, theoretical work in the early years of this century has convinced an increasing number of string theorists (but not all) that the theory (whatever it may turn out to be) will not predict a unique low energy limit (or “vacuum state”), but rather an immense “landscape” of possible universes, with estimates like 10^100 and 10^500 and even more bandied around (by comparison, there are only about 10^80 elementary particles in the entire observable universe—a minuscule number compared to such as these). Most of these possible universes would be hideously inhospitable to intelligent life as we know and can imagine it (but our imagination may be limited), and hence it is said that the reason we find ourselves in one of the rare universes which contain galaxies, chemistry, biology, and the National Science Foundation is due to the anthropic principle: a statement, bordering on tautology, that we can only observe conditions in the universe which permit our own existence, and that perhaps, in a “multiverse” of causally disjoint or parallel realities, all the other possibilities exist as well, most devoid of observers, at least those like ourselves (triune glorgs, feeding on bare colour in universes dominated by quark-gluon plasma, would doubtless deem our universe unthinkably cold, rarefied, and dead).

But adopting the “landscape” view means abandoning the quest for a theory of everything and settling for what amounts to a “theory of anything”. For even if string theorists do manage to find one of those 10^100 or whatever solutions in the landscape which perfectly reproduces all the experimental results of the standard model (and note that this is something nobody has ever done and appears far out of reach, with legitimate reasons to doubt it is possible at all), then there will almost certainly be a bewildering number of virtually identical solutions with slightly different results, so that any plausible experiment which measures a quantity to more precision or discovers a previously unknown phenomenon can be accommodated within the theory simply by tuning one of its multitudinous dials and choosing a different solution which agrees with the experimental results. This is not what many of the generation who built the great intellectual edifice of the standard model of particle physics would have considered doing science.

Now if string theory were simply a chimæra being pursued by a small band of double-domed eccentrics, one wouldn't pay it much attention. Science advances by exploring lots of ideas which may seem crazy at the outset and discarding the vast majority which remain crazy after they are worked out in more detail. Whatever remains, however apparently crazy, stays in the box as long as its predictions are not falsified by experiment. It would be folly of the greatest magnitude, comparable to attempting to centrally plan the economy of a complex modern society, to try to guess in advance, by some kind of metaphysical reasoning, which ideas were worthy of exploration. The history of the S-matrix or “bootstrap” theory of the strong interactions recounted in chapter 11 is an excellent example of how science is supposed to work. A beautiful theory, accepted by a large majority of researchers in the field, which was well in accord with experiment and philosophically attractive, was almost universally abandoned in a few years after the success of the quark model in predicting new particles and the stunning deep inelastic scattering results at SLAC in the 1970s.

String theory, however, despite not having made a single testable prediction after more than thirty years of investigation, now seems to risk becoming a self-perpetuating intellectual monoculture in theoretical particle physics. Among the 22 tenured professors of theoretical physics in the leading six faculties in the United States who received their PhDs after 1981, fully twenty specialise in string theory (although a couple now work on the related brane-world models). These professors employ graduate students and postdocs who work in their area of expertise, and when a faculty position opens up, may be expected to support candidates working in fields which complement their own research. This environment creates a great incentive for talented and ambitious students aiming for one of the rare permanent academic appointments in theoretical physics to themselves choose string theory, as that's where the jobs are. After a generation, this process runs the risk of operating on its own momentum, with nobody in a position to step back and admit that the entire string theory enterprise, judged by the standards of genuine science, has failed, and does not merit the huge human investment by the extraordinarily talented and dedicated people who are pursuing it, nor the public funding it presently receives. If Edward Witten believes there's something still worth pursuing, fine: his self-evident genius and massive contributions to mathematical physics more than justify supporting his work. But this enterprise which is cranking out hundreds of PhDs and postdocs who are spending their most intellectually productive years learning a fantastically complicated intellectual structure with no grounding whatsoever in experiment, most of whom will have no hope of finding permanent employment in the field they have invested so much to aspire toward, is much more difficult to justify or condone.

The problem, to state it in a manner more inflammatory than the measured tone of the author, and in a word of my choosing which I do not believe appears at all in his book, is that contemporary academic research in high energy particle theory is corrupt. As is usually the case with such corruption, the root cause is socialism, although the look-only-left blinders almost universally worn in academia today hide this from most observers there. Dwight D. Eisenhower, however, twigged to it quite early. In his farewell address of January 17th, 1961, which academic collectivists endlessly cite for its (prescient) warning about the “military-industrial complex”, he went on to say, although this is rarely quoted,

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.

And there, of course, is precisely the source of the corruption. This enterprise of theoretical elaboration is funded by taxpayers, who have no say in how their money, taken under threat of coercion, is spent. Which researchers receive funds for what work is largely decided by the researchers themselves, acting as peer review panels. While peer review may work to vet scientific publications, as soon as money becomes involved, the disposition of which can make or break careers, all the venality and naked self- and group-interest which has undone every well-intentioned experiment in collectivism since Robert Owen comes into play, with the completely predictable and tediously repeated results. What began as an altruistic quest driven by intellectual curiosity to discover answers to the deepest questions posed by nature ends up, after a generation of grey collectivism, as a jobs program. In a sense, string theory can be thought of like that other taxpayer-funded and highly hyped program, the space shuttle, which is hideously expensive, dangerous to the careers of those involved with it (albeit in a more direct manner), supported by a standing army composed of some exceptional people and a mass of the mediocre, difficult to close down because it has carefully cultivated a constituency whose own self-interest is invested in continuation of the program, and almost completely unproductive of genuine science.

One of the author's concerns is that the increasingly apparent impending collapse of the string theory edifice may result in the de-funding of other promising areas of fundamental physics research. I suspect he may under-estimate how difficult it is to get rid of a government program, however absurd, unjustified, and wasteful it has become: consider the space shuttle, or mohair subsidies. But perhaps de-funding is precisely what is needed to eliminate the corruption. Why should U.S. taxpayers be spending on the order of thirty million dollars a year on theoretical physics not only devoid of any near- or even distant-term applications, but also mostly disconnected from experiment? Perhaps if theoretical physics returned to being funded by universities from their endowments and operating funds, and by money raised from patrons and voluntarily contributed by the public interested in the field, it would be, albeit a much smaller enterprise, a more creative and productive one. Certainly it would be more honest. Sure, there may be some theoretical breakthrough we might not find for fifty years instead of twenty with massive subsidies. But so what? The truth is out there, somewhere in spacetime, and why does it matter (since it's unlikely in the extreme to have any immediate practical consequences) how soon we find it, anyway? And who knows, it's just possible a research programme composed of the very, very best, whose work is of such obvious merit and creativity that it attracts freely-contributed funds, exploring areas chosen solely on their merit by those doing the work, and driven by curiosity instead of committee group-think, might just get there first. That's the way I'd bet.

For a book addressed to a popular audience, and one which contains not a single equation, many readers will find it quite difficult. If you don't follow these matters in some detail, you may find some of the more technical chapters rather bewildering. (The author, to be fair, acknowledges this at the outset.) For example, if you don't know what the hierarchy problem is, or why it is important, you probably won't be able to figure it out from the discussion here. On the other hand, policy-oriented readers will have little difficulty grasping the problems with the string theory programme and its probable causes even if they skip the gnarly physics and mathematics. An entertaining discussion of some of the problems of string theory, in particular the question of “background independence”, in which the string theorists universally assume the existence of a background spacetime which general relativity seems to indicate doesn't exist, may be found in Carlo Rovelli's “A Dialog on Quantum Gravity”. For more technical details, see Lee Smolin's Three Roads to Quantum Gravity. There are some remarkable factoids in this book, one of the most stunning being that the proposed TeV-class muon colliders of the future will produce neutrino (yes, neutrino) radiation which is dangerous to humans off-site. I didn't believe it either, but look here—imagine the sign: “DANGER: Neutrino Beam”!

A U.S. edition is scheduled for publication at the end of September 2006. The author has operated the Not Even Wrong Web log since 2004; it is an excellent source for news and gossip on these issues. The unnamed “excitable … Harvard faculty member” mentioned on p. 227 and elsewhere is Luboš Motl (who is, however, named in the acknowledgements), and whose own Web log is always worth checking out.

 Permalink

Bartlett, Bruce. Impostor. New York: Doubleday, 2006. ISBN 0-385-51827-7.
This book is a relentless, uncompromising, and principled attack on the administration of George W. Bush by an author whose conservative credentials are impeccable and whose knowledge of economics and public finance is authoritative; he was executive director of the Joint Economic Committee of Congress during the Reagan administration and later served in the Reagan White House and in the Treasury Department under the first president Bush. For the last ten years he was a Senior Fellow at the National Center for Policy Analysis, which fired him in 2005 for writing this book.

Bartlett's primary interest is economics, and he focuses almost exclusively on the Bush administration's spending and tax policies here, with foreign policy, the wars in Afghanistan and Iraq, social policy, civil liberties, and other contentious issues discussed only to the extent they affect the budget. The first chapter, titled “I Know Conservatives, and George W. Bush Is No Conservative”, states the central thesis, which is documented by detailed analysis of the collapse of the policy-making process in Washington, the expensive and largely ineffective tax cuts, the ruinous Medicare prescription drug program (and the shameful way in which its known costs were covered up while the bill was rammed through Congress), the abandonment of free trade whenever there were votes to be bought, the explosion in regulation, and the pork-packed spending frenzy in the Republican-controlled House and Senate which Bush has done nothing to restrain (he is the first president since John Quincy Adams to serve a full four-year term and never veto a single piece of legislation). All of this is documented in almost 80 pages of notes and source references.

Bartlett is a “process” person as well as a policy wonk, and he diagnoses the roots of many of the problems as due to the Bush White House's resembling a third and fourth Nixon administration. There is the same desire for secrecy, the intense value placed on personal loyalty, the suppression of active debate in favour of a unified line, isolation from outside information and opinion, an attempt to run everything out of the White House, bypassing the policy shops and resources in the executive departments, and the paranoia induced by uniformly hostile press coverage and detestation by intellectual elites. Also Nixonesque is the free-spending attempt to buy the votes, whatever the cost or long-term consequences, of members of groups who are unlikely in the extreme to reward Republicans for their largesse because they believe they'll always get a better deal from the Democrats.

The author concludes that the inevitable economic legacy of the Bush presidency will be large tax increases in the future, perhaps not on Bush's watch, but correctly identified as the consequences of his irresponsibility when they do come to pass. He argues that the adoption of a European-style value-added tax (VAT) is the “least bad” way to pay the bill when it comes due. The long-term damage done to conservatism and the Republican party are assessed, along with prospects for the post-Bush era.

While Bartlett was one of the first prominent conservatives to speak out against Bush, he is hardly alone today, with disgruntlement on the right seemingly restrained mostly due to lack of alternatives. And that raises a question on which this book is silent: if Bush has governed (at least in domestic economic policy) irresponsibly, incompetently, and at variance with conservative principles, what other potential candidate could have been elected instead who would have been the true heir of the Reagan legacy? Al Gore? John Kerry? John McCain? Steve Forbes? What plausible candidate in either party seems inclined and capable of turning things around instead of making them even worse? The irony, and a fundamental flaw of Empire, seems to be that empires don't produce the kind of leaders who built them, nor the kind required to avert their decline. It's fundamentally a matter of crunchiness and sogginess, and it's why empires don't last forever.

 Permalink

Ortega y Gasset, José. The Revolt of the Masses. New York: W. W. Norton, [1930, 1932, 1964] 1993. ISBN 0-393-31095-7.
This book, published more than seventy-five years ago, when the twentieth century was only three decades old, is a simply breathtaking diagnosis of the crises that manifested themselves in that century and the prognosis for human civilisation. The book was published in Spanish in 1930; this English translation, authorised and approved by the author, by a translator who requested to remain anonymous, first appeared in 1932 and has been in print ever since.

I have encountered few works so short (just 190 pages), which are so densely packed with enlightening observations and thought-provoking ideas. When I read a book, if I encounter a paragraph that I find striking, either in the writing or the idea it embodies, I usually add it to my “quotes” archive for future reference. If I did so with this book, I would find myself typing in a large portion of the entire text. This is not an easy read, not due to the quality of the writing and translation (which are excellent), nor the complexity of the concepts and arguments therein, but simply due to the pure number of insights packed in here, each of which makes you stop and ponder its derivation and implications.

The essential theme of the argument anticipated the crunchy/soggy analysis of society by more than 65 years. In brief, over-achieving self-motivated elites create liberal democracy and industrial economies. Liberal democracy and industry lead to the emergence of the “mass man”, self-defined as not of the elite and hostile to existing elite groups and institutions. The mass man, by strength of numbers and through the democratic institutions which enabled his emergence, seizes the levers of power and begins to use the State to gratify his immediate desires. But, unlike the elites who created the State, the mass man does not think or plan in the long term, and is disinclined to make the investments and sacrifices which were required to create the civilisation in the first place, and remain necessary if it is to survive. In this consists the crisis of civilisation, and grasping this single concept explains much of the history of the seven decades which followed the appearance of the book and events today. Suddenly some otherwise puzzling things start to come into focus, such as why it is, in a world enormously more wealthy than that of the nineteenth century, with abundant and well-educated human resources and technological capabilities which dwarf those of that epoch, there seems to be so little ambition to undertake large-scale projects, and why those which are embarked upon are so often bungled.

In a single footnote on p. 119, Ortega y Gasset explains what the brilliant Hans-Hermann Hoppe spent an entire book doing: why hereditary monarchies, whatever their problems, are usually better stewards of the national patrimony than democratically elected leaders. In pp. 172–186 he explains the curious drive toward European integration which has motivated conquerors from Napoleon through Hitler, and collectivist bureaucratic schemes such as the late, unlamented Soviet Union and the odious present-day European Union. On pp. 188–190 he explains why a cult of youth emerges in mass societies, and why they produce as citizens people who behave like self-indulgent perpetual adolescents. In another little single-sentence footnote on p. 175 he envisions the disintegration of the British Empire, then at its zenith, and the cultural fragmentation of the post-colonial states. I'm sure that few of the author's intellectual contemporaries could have imagined their descendants living among the achievements of Western civilisation yet largely ignorant of its history or cultural heritage; the author nails it in chapters 9–11, explaining why it was inevitable and tracing the consequences for the civilisation, then in chapter 12 he forecasts the fragmentation of science into hyper-specialised fields and the implications of that. On pp. 184–186 he explains the strange attraction of Soviet communism for European intellectuals who otherwise thought themselves individualists—recall, this is but six years after the death of Lenin. And still there is more…and more…and more. This is a book you can probably re-read every year for five years in a row and get something more out of it every time.

A full-text online edition is available, which is odd since the copyright of the English translation was last renewed in 1960 and should still be in effect, yet the site which hosts this edition claims that all their content is in the public domain.

 Permalink

Weinberger, Sharon. Imaginary Weapons. New York: Nation Books, 2006. ISBN 1-56025-849-7.

A nuclear isomer is an atomic nucleus which, due to having a greater spin, different shape, or differing alignment of the spin orientation and axis of symmetry, has more internal energy than the ground state nucleus with the same number of protons and neutrons. Nuclear isomers are usually produced in nuclear fusion reactions when the addition of protons and/or neutrons to a nucleus in a high-energy collision leaves it in an excited state. Hundreds of nuclear isomers are known, but the overwhelming majority decay with gamma ray emission in about 10^-14 seconds. In a few species, however, this almost instantaneous decay is suppressed for various reasons, and metastable isomers exist with half-lives ranging from 10^-9 seconds (one nanosecond), to the isomer Tantalum-180m, which has a half-life of at least 10^15 years and may be entirely stable; it is the only nuclear isomer found in nature and accounts for about one atom in 8300 in tantalum metal.

Some metastable isomers with intermediate half-lives have a remarkably large energy compared to the ground state and emit correspondingly energetic gamma ray photons when they decay. The Hafnium-178m2 (the “m2” denotes the second lowest energy isomeric state) nucleus has a half-life of 31 years and decays (through the m1 state) with the emission of 2.45 MeV in gamma rays. Now the fact that there's a lot of energy packed into a radioactive nucleus is nothing new—people were calculating the energy of disintegrating radium and uranium nuclei at the end of the 19th century, but all that energy can't be used for much unless you can figure out some way to release it on demand—as long as it just dribbles out at random, you can use it for some physics experiments and medical applications, but not to make loud bangs or turn turbines. It was only the discovery of the fission chain reaction, where the fission of certain nuclei liberates neutrons which trigger the disintegration of others in an exponential process, which made nuclear energy, for better or for worse, accessible.

So, as long as there is no way to trigger the release of the energy stored in a nuclear isomer, it is nothing more than an odd kind of radioactive element, the subject of a reasonably well-understood and somewhat boring topic in nuclear physics. If, however, there were some way to externally trigger the decay of the isomer to the ground state, then the way would be open to releasing the energy in the isomer at will. It is possible to trigger the decay of the Tantalum-180 isomer with 2.8 MeV photons, but the energy required to trigger the decay is vastly greater than the 0.075 MeV it releases (more than 37 times as much in as out per nucleus, even if every photon triggered a decay), so the process is simply an extremely complicated and expensive way to waste energy.

Researchers in the small community interested in nuclear isomers were stunned when, in the January 25, 1999 issue of Physical Review Letters, a paper by Carl Collins and his colleagues at the University of Texas at Dallas reported they had triggered the release of 2.45 MeV in gamma rays from a sample of Hafnium-178m2 by irradiating it with a second-hand dental X-ray machine with the sample of the isomer sitting on a styrofoam cup. Their report implied, even with the crude apparatus, an energy gain of sixty times break-even, which was more than a million times the rate predicted by nuclear theory, if triggering were possible at all. The result, if real, could have substantial technological consequences: the isomer could be used as a nuclear battery, which could store energy and release it on demand with a density which dwarfed that of any chemical battery and was only a couple of orders of magnitude less than a fission bomb. And, speaking of bombs, if you could manage to trigger a mass of hafnium all at once or arrange for it to self-trigger in a chain reaction, you could make a variety of nifty weapons out of it, including a nuclear hand grenade with a yield of two kilotons. You could also build a fission-free trigger for a thermonuclear bomb which would evade all of the existing nonproliferation safeguards which are aimed at controlling access to fissile material. These are the kind of things that get the attention of folks in that big five-sided building in Arlington, Virginia.
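
To get a feel for the scale of these claims, a back-of-the-envelope calculation suffices. The following sketch (in Python, and mine rather than anything from the book) assumes only the 2.45 MeV per decay and mass number 178 quoted above, plus the conventional TNT equivalence of 4184 joules per gram:

    # Energy stored per gram of pure Hf-178m2, from figures quoted above
    AVOGADRO = 6.022e23      # atoms per mole
    MASS_NUMBER = 178        # grams per mole, approximately
    DECAY_MEV = 2.45         # MeV of gamma rays released per decay
    MEV_TO_J = 1.602e-13     # joules per MeV
    TNT_J_PER_G = 4184.0     # conventional TNT equivalence

    atoms_per_gram = AVOGADRO / MASS_NUMBER
    joules_per_gram = atoms_per_gram * DECAY_MEV * MEV_TO_J
    print("Energy density: %.2e J/g" % joules_per_gram)     # about 1.3e9 J/g
    print("TNT equivalent: %.0f kg TNT per gram" % (joules_per_gram / TNT_J_PER_G / 1000.0))

On those assumptions a single gram of the pure isomer stores about 1.3 gigajoules, the energy of roughly 300 kilograms of TNT, which is what makes both the nuclear battery and the bomb look so seductive on paper.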

And so it came to pass, in a Pentagon bent on “transformational technologies” and concerned with emerging threats from potential adversaries, that in May of 2003 a Hafnium Isomer Production Panel (HIPP) was assembled to draw up plans for bulk production of the substance, with visions of nuclear hand grenades, clean bunker-busting fusion bombs, and even hafnium-powered bombers floating before the eyes of the out-of-the-box thinkers at DARPA, who envisioned a two-year budget of USD 30 million for the project—military science marches into the future. What's wrong with this picture? Well, actually rather a lot of things.

  • No other researcher had been able to reproduce the results from the original experiment. This included a team of senior experimentalists who used the Advanced Photon Source at Argonne National Laboratory and state of the art instrumentation and found no evidence whatsoever for triggering of the hafnium isomer with X-rays—in two separate experiments.
  • As noted above, well-understood nuclear theory predicted the yield from triggering, if it occurred, to be six orders of magnitude less than reported in Collins's paper.
  • An evaluation of the original experiment by the independent JASON group of senior experts in 1999 determined the result to be “a priori implausible” and “inconclusive, at best”.
  • A separate evaluation by the Institute for Defense Analyses concluded the original paper reporting the triggering results “was flawed and should not have passed peer review”.
  • Collins had never run, and refused to run, a null experiment with ordinary hafnium to confirm that the very small effect he reported went away when the isomer was removed.
  • James Carroll, one of the co-authors of the original paper, had obtained nothing but null results in his own subsequent experiments on hafnium triggering.
  • Calculations showed that even if triggering were to be possible at the reported rate, the process would not come close to breaking even: more than six times as much X-ray energy would go in as gamma rays came out.
  • Even if triggering worked, and some way were found to turn it into an energy source or explosive device, the hafnium isomer does not occur in nature and would have to be made by a hideously inefficient process in a nuclear reactor or particle accelerator, at a cost estimated at around a billion dollars per gram. The explosive in the nuclear hand grenade would cost tens of billions of dollars, compared to which highly enriched uranium and plutonium are cheap as dirt.
  • If the material could be produced and triggering made to work, the resulting device would pose an extreme radiation hazard. Specific activity is inversely proportional to half-life, so the hafnium isomer, with a 31 year half-life, is vastly more radioactive than U-235 (700 million years) or Pu-239 (24,000 years); a quick calculation following this list makes the comparison concrete. Further, hafnium isomer decays emit gamma rays, which are the most penetrating form of ionising nuclear radiation and the most difficult against which to shield. The shielding required to protect humans in the vicinity of a tangible quantity of hafnium isomer would more than negate its small mass and compact size.
  • A hafnium explosive device would disperse large quantities of the unreacted isomer (since a relatively small percentage of the total explosive can react before the device is disassembled in the explosion). As it turns out, the half-life of the isomer is just about the same as that of Cesium-137, which is often named as the prime candidate for a “dirty” radiological bomb. One physicist on the HIPP (p. 176) described a hafnium weapon as “the mother of all dirty bombs”.
  • And consider that hand grenade, which would weigh about five pounds. How far can you throw a five pound rock? What do you think about being that far away from a detonation with the energy of two thousand tons of TNT, all released in prompt gamma rays?
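
The radiation comparison above can be checked directly: specific activity per gram is (ln 2 / half-life) × (atoms per gram). Here is a rough sketch (in Python, mine rather than the book's), assuming pure samples and counting only the parent decay:

    # Specific activity A = (ln 2 / half-life) * (atoms per gram)
    import math

    AVOGADRO = 6.022e23      # atoms per mole
    YEAR_S = 3.156e7         # seconds per year

    def becquerels_per_gram(half_life_years, mass_number):
        decay_constant = math.log(2) / (half_life_years * YEAR_S)    # 1/s
        return decay_constant * AVOGADRO / mass_number               # decays/s per gram

    for name, half_life, mass in [("Hf-178m2", 31.0, 178),
                                  ("Pu-239", 2.4e4, 239),
                                  ("U-235", 7.0e8, 235)]:
        print("%-9s %.1e Bq/g" % (name, becquerels_per_gram(half_life, mass)))

This yields about 2.4×10^12 becquerels per gram for the isomer, versus 2.3×10^9 for Pu-239 and 8×10^4 for U-235: gram for gram, roughly a thousand times as radioactive as plutonium, with every decay emitting a penetrating gamma ray.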

But bad science, absurd economics, a nonexistent phenomenon, damning evaluations by panels of authorities, lack of applications, and ridiculous radiation risk in the extremely improbable event of success pose no insurmountable barriers to a government project once it gets up to speed, especially one in which the relationships between those providing the funding and its recipients are complicated and unseemingly cozy. It took an exposé in the Washington Post Magazine by the author and subsequent examination in Congress to finally drive a stake through this madness—maybe. As of the end of 2005, although DARPA was out of the hafnium business (at least publicly), there were rumours of continued funding thanks to a Congressional earmark in the Department of Energy budget.

This book is a well-researched and fascinating look inside the defence underworld where fringe science feeds on federal funds, and starkly demonstrates how weird and wasteful things can get when Pentagon bureaucrats disregard their own science advisors and substitute instinct and wishful thinking for the tedious, but ultimately reliable, scientific method. Many aspects of the story are also quite funny, although U.S. taxpayers who footed the bill for this madness may be less amused. The author has set up a Web site for the book, and Carl Collins, who conducted the original experiment with the dental X-ray and styrofoam cup which incited the mania, has responded with his own, almost identical in appearance, riposte. If you're interested in more technical detail on the controversy than appears in Weinberger's book, the Physics Today article from May 2004 is an excellent place to start. The book contains a number of typographical and factual errors, none of which are significant to the story, but when the first line of the Author's Note uses “sited” when “cited” is intended, and in the next paragraph “wondered” instead of “wandered”, you have to—wonder.

It is sobering to realise that this folly took place entirely in the public view: in the open scientific literature, university labs, unclassified defence funding subject to Congressional oversight, and ultimately in the press, and yet over a period of years millions in taxpayer funds were squandered on nonsense. Just imagine what is going on in highly-classified “black” programs.

 Permalink

July 2006

Herrmann, Alexander. Herrmann's Book of Magic. Chicago: Frederick J. Drake & Co., 1903. LCCN 05035787.
When you were a kid, did your grandfather ever pull a coin from his pocket, clap his hands together and make it disappear, then “find” it behind your ear, sending you off to the Popsicle truck for a summer evening treat? If so, and you're now grandparent age yourself, this may be the book from which he learned that trick. Alexander Herrmann was a prominent stage magician in the latter half of the nineteenth century. In this 1903 book, he reveals many of the secrets of the conjuror, from the fundamental sleight of hand skills of palming objects and vanishing and producing them, to the operation of famous illusions such as the disembodied head which speaks. This on-line edition, available both in HTML and Plain ASCII formats, is a complete reproduction of the book, including (in the HTML edition) all the illustrations.

If you must have a printed copy, you may find one at abebooks.com, but it will probably be expensive. It's much better to read the on-line edition, produced from a copy Bill Walker found at a yard sale and kindly contributed for the purpose.

 Permalink

Berlinski, Claire. Menace in Europe. New York: Crown Forum, 2006. ISBN 1-4000-9768-1.
This is a scary book. The author, who writes with a broad and deep comprehension of European history and its cultural roots, and a vocabulary which reminds one of William F. Buckley, argues that the deep divide which has emerged between the United States and Europe since the end of the cold war, and particularly in the last few years, is not a matter of misunderstanding, lack of sensitivity on the part of the U.S., or the personnel, policies, and style of the Bush administration, but is deeply rooted in structural problems in Europe which are getting worse, not better. (That's not to say that there aren't dire problems in the U.S. as well, but that isn't the topic here.)

Surveying the contemporary scene in the Netherlands, Britain, France, Spain, Italy, and Germany, and tracing the roots of nationalism, peasant revolts (of which “anti-globalisation” is the current manifestation), and anti-Semitism back through the centuries, she shows that what is happening in Europe today is simply Europe—the continent of too many kings and too many wars—being Europe, adapted to present-day circumstances. The impression you're left with is that Europe isn't just the “sick man of the world”, but rather a continent afflicted with half a dozen or more separate diseases, all terminal: a large, un-assimilated immigrant population concentrated in ghettos; an unsustainable welfare state; a sclerotic economy weighed down by social charges, high taxes, and ubiquitous and counterproductive regulation; a collapsing birth rate and aging population; a “culture crash” (my term), where the religions and ideologies which have structured the lives of Europeans for millennia have evaporated, leaving nothing in their place; a near-total disconnect between elites and the general population on the disastrous project of European integration, most recently manifested in the controversy over the so-called European constitution; and signs that the rabid nationalism which plunged Europe into two disastrous wars in the last century and dozens, if not hundreds of wars in the centuries before, is seeping back up through the cracks in the foundation of the dystopian, ill-conceived European Union.

In some regards, the author does seem to overstate the case, or generalise from evidence so narrow it lacks persuasiveness. The most egregious example is chapter 8, which infers an emerging nihilist neo-Nazi nationalism in Germany almost entirely based on the popularity of the band Rammstein. Well, yes, but whatever the lyrics, the message of the music, and the subliminal message of the music videos, there is a lot more going on in Germany, a nation of more than 80 million people, than the antics of a single heavy metal band, however atavistic.

U.S. readers inclined to gloat over the woes of the old continent should keep in mind the author's observation, a conclusion I had come to long before I ever opened this book, that the U.S. is heading directly for the same confluence of catastrophes as Europe, and, absent a fundamental change of course, will simply arrive at the scene of the accident somewhat later; and that's only taking into account the problems they have in common: the European economy, unlike the American, is able to function without borrowing on the order of two billion dollars a day from China and Japan.

If you live in Europe, as I have for the last fifteen years (thankfully outside, although now encircled by, the would-be empire that sprouted from Brussels), you'll probably find little here that's new, but you may get a better sense of how the problems interact with one another to make a real crisis somewhere in the future a genuine possibility. The target audience in the U.S., which is so often lectured by their elite that Europe is so much more sophisticated, nuanced, socially and environmentally aware, and rational, may find this book an eye opener; 344,955 American soldiers perished in European wars in the last century, and while it may be satisfying to say, “To Hell with Europe!”, the lesson of history is that saying so is most unwise.

An Instapundit podcast interview with the author is freely available on-line.

 Permalink

Williamson, Donald I. The Origins of Larvae. Dordrecht, The Netherlands: Kluwer Academic, 2003. ISBN 1-4020-1514-3.
I am increasingly beginning to suspect that we are living through an era which, in retrospect, will be seen, like the early years of the twentieth century, as the final days preceding revolutions in a variety of scientific fields. Precision experiments and the opening of new channels of information about the universe as diverse as the sequencing of genomes, the imminent detection of gravitational waves, and detailed measurement of the cosmic background radiation are amassing more and more discrepant data which causes scientific journeymen to further complicate their already messy “standard models”, and the more imaginative among them to think that maybe there are simple, fundamental things which we're totally missing. Certainly, when the scientific consensus is that everything we see and know about comprises less than 5% of the universe, and a majority of the last generation of theorists in high energy physics have been working on a theory which only makes sense in a universe with ten, or maybe eleven, or maybe twenty-six dimensions, there would seem to be a lot of room for an Einstein-like conceptual leap which would make everybody slap their foreheads and exclaim, “How could we have missed that!”

But still we have Darwin, don't we? If the stargazers and particle smashers are puzzled by what they see, certainly the more down-to-earth folk who look at creatures that inhabit our planet still stand on a firm foundation, don't they? Well…maybe not. Perhaps, as this book argues, not only is the conventional view of the “tree of life” deeply flawed, the very concept of a tree, where progenitor species always fork into descendants, but there is never any interaction between the ramified branches, is incorrect. (Just to clarify in advance: the author does not question the fundamental mechanism of Darwinian evolution by natural selection of inherited random variations, nor argue for some other explanation for the origin of the diversity in species on Earth. His argument is that this mechanism may not be the sole explanation for the characteristics of the many species with larval forms or discordant embryonic morphology, and that the assumption made by Darwin and his successors that evolution is a pure process of diversification [or forking of species from a common ancestor, as if companies only developed by spin-offs, and never did mergers and acquisitions] may be a simplification that, while it makes the taxonomist's job easier, is not warranted by the evidence.)

Many forms of life on Earth are not born from the egg as small versions of their adult form. Instead, they are born as larvae, which are often radically different in form from the adult. The best known example is moths and butterflies, which hatch as caterpillars, and subsequently reassemble themselves into the winged insects which mate and produce eggs that hatch into the next generation of caterpillars. Larvae are not restricted to arthropoda and other icky phyla: frogs and toads are born as tadpoles and live in one body form, then transform into quite different adults. Even species, humans included, which are born as little adults, go through intermediate stages as developing embryos which have the characteristics of other, quite different species.

Now, when you look closely at this (and many will be deterred, because a good many larvae, and the species they mature into, are rather dreadful), you'll find a long list of curious things which have puzzled naturalists all the way back to Darwin and before. There are numerous examples of species which closely resemble one another and are classified by taxonomists in the same genus, yet whose larvae are entirely different from one another—so much so that if the larvae were classified by themselves, they would probably be put into different classes or phyla. There are almost identical larvae which develop into species only distantly related. Closely related species include those with one or more larval forms, and others which develop directly: hatching as small individuals already with the adult form. And there are animals which, in their adult form, closely resemble the larvae of other species.

What a mess—but then biology is usually messy! The author, an expert on marine invertebrates (from which the vast majority of examples in this book are drawn), argues that there is a simple explanation for all of these discrepancies and anomalies, one which, if you aren't a biologist yourself, may have already occurred to you—that larvae (and embryonic forms) are the result of a hybridisation or merger of two unrelated species, with the result being a composite which hatches in one form and then subsequently transforms into the other. The principle of natural selection would continue to operate on these inter-specific mergers, of course: complicating or extending the development process of an animal before it could reproduce would probably be selected out, but, on the other hand, adding a free-floating or swimming larval form to an animal whose adult crawls on the ocean bottom or remains fixed to a given location like a clam or barnacle could confer a huge selective advantage on the hybrid, and equip it to ride out mass extinction events because the larval form permitted the species to spread to marginal habitats where it could survive the extinction event.

The acquisition of a larva by successful hybridisation could spread among the original species with no larval form not purely by differential selection but like a sexually transmitted disease—in other words, like wildfire. Note that many marine invertebrates reproduce simply by releasing their eggs and sperm into the sea and letting nature sort it out; consequently, the entire ocean is a kind of promiscuous pan-specific singles bar where every pelagic and benthic creature is trying to mate, utterly indiscriminately, with every other at the whim of the wave and current. Most times, as in singles bars, it doesn't work out, but suppose sometimes it does?

You have to assume a lot of improbable things for this to make sense, the most difficult of which is that you can combine the sperm and egg of vastly different creatures and (on extremely rare occasions) end up with a hybrid which is born in the form of one and then, at some point, spontaneously transforms into the other. But ruling this out (or deciding it's plausible) requires understanding the “meta-program” of embryonic development—until we do, there's always the possibility we'll slap our foreheads when we realise how straightforward the mechanism is which makes this work.

One thing is clear: this is real science; the author makes unambiguous predictions about biology which can be tested in a variety of ways: laboratory experiments in hybridisation (on pp. 213–214 he advises those interested in how to persuade various species to release their eggs and sperm), analysis of genomes (which ought to show evidence of hybridisation in the past), and detailed comparison of adult species which are possible progenitors of larval forms with larvae of those with which they may have hybridised.

If you're insufficiently immersed in the utter weirdness of life forms on this little sphere we inhabit, there is plenty here to astound you. Did you know, for example, about Owenia fusiformis (p. 72), which undergoes a “cataclysmic metamorphosis” that puts the chest-burster of Alien to shame: the larva develops a juvenile worm within it which, in less than thirty seconds, turns itself inside-out and swallows the larva, devouring it in fifteen minutes. The larva does not “develop into” the juvenile, as is often said; it is like the first stage of a rocket which is discarded after it has done its job. How could this have evolved smoothly by small, continuous changes? For sheer brrrr factor, it's hard to beat the nemertean worms, which develop from tiny larvae into adults some of which exceed thirty metres in length (p. 87).

The author is an expert, and writes for his peers. There are many paragraphs like the following (p. 189), which will send you to the glossary at the end of the text (don't overlook it—otherwise you'll spend lots of time looking up things on the Web).

Adult mantis shrimp (Stomatopoda) live in burrows. The five anterior thoracic appendages are subchelate maxillipeds, and the abdomen bears pleopods and uropods. Some hatch as antizoeas: planktonic larvae that swim with five pairs of biramous thoracic appendages. These larvae gradually change into pseudozoeas, with subchelate maxillipeds and with four or five pairs of natatory pleopods. Other stomatopods hatch as pseudozoeas. There are no uropods in the larval stages. The lack of uropods and the form of the other appendages contrasts with the condition in decapod larvae. It seems improbable that stomatopod larvae could have evolved from ancestral forms corresponding to zoeas and megalopas, and I suggest that the Decapoda and the Stomatopoda acquired their larvae from different foreign sources.
In addition to the zoö-jargon, another deterrent to reading this book is the cost: a list price of USD 109, quoted at Amazon.com at this writing at USD 85, which is a lot of money for a 260 page monograph, however superbly produced and notwithstanding its small potential audience; so fascinating and potentially significant is the content that one would happily part with USD 15 to read a PDF, but at prices like this one's curiosity becomes constrained by the countervailing virtue of parsimony. Still, if Williamson is right, some of the fundamental assumptions underlying our understanding of life on Earth for the last century and a half may be dead wrong, and if his conjecture stands the test of experiment, we may have at hand an understanding of mysteries such as the Cambrian explosion of animal body forms and the apparent “punctuated equilibria” in the fossil record. There is a Nobel Prize here for somebody who confirms that this supposition is correct. Lynn Margulis, whose own theory of the origin of eukaryotic cells by the incorporation of previously free-living organisms as endosymbionts, which is now becoming the consensus view, co-authors a foreword which endorses Williamson's somewhat similar view of larvae.

 Permalink

Reasoner, James. Draw: The Greatest Gunfights of the American West. New York: Berkley, 2003. ISBN 0-425-19193-1.
The author is best known as a novelist, author of a bookshelf full of yarns, mostly set in the Wild West, but also in the War Between the States and World War II. In this, his first work of nonfiction after twenty-five years as a writer, he sketches in 31 short chapters (of less than ten pages average length, with a number including pictures) the careers and climactic (and often career-ending) conflicts of the best known gunslingers of the Old West, as well as many lesser-known figures, some of whom were just as deadly and, in their own time, notorious. Here are tales of Wyatt Earp, Doc Holliday, the Dalton Gang, Bat Masterson, Bill Doolin, Pat Garrett, John Wesley Hardin, Billy the Kid, and Wild Bill Hickok; but also Jim Levy, the Jewish immigrant from Ireland who was considered by both Earp and Masterson to be one of the deadliest gunfighters in the West; Henry Starr, who robbed banks from the 1890s until his death in a shoot-out in 1921, pausing in mid-career to write, direct, and star in a silent movie about his exploits, A Debtor to the Law; and Ben Thompson, whom Bat Masterson judged to be the fastest gun in the West, and who was, at various times, an Indian fighter, Confederate cavalryman, mercenary for Emperor Maximilian of Mexico, gambler, gunfighter,…and chief of police of Austin, Texas. Many of the characters who figure here worked both sides of the law, in some cases concurrently.

The author does not succumb to the temptation to glamorise these mostly despicable figures, nor the tawdry circumstances in which so many met their ends. (Many, but not all: Bat Masterson survived a career as deputy sheriff in Dodge City, sheriff of Ford County, Kansas, Marshal of Trinidad, Colorado, and as itinerant gambler in the wildest towns of the West, to live the last twenty years of his life in New York City, working as sports editor and columnist for a Manhattan newspaper.) Reasoner does, however, attempt to spice up the narrative with frontier lingo (whether genuine or bogus, I know not): lawmen and “owlhoots” (outlaws) are forever slappin' leather, loosing or dodging hails of lead, getting thrown in the hoosegow, or seeking the comfort of the soiled doves who plied their trade above the saloons. This can become tedious if you read the book straight through; it's better enjoyed a chapter at a time spread out over an extended period. The chapters are completely independent of one another (although there are a few cross-references), and may be read in any order. In fact, they read like a collection of magazine columns, but there is no indication in the book that they were ever previously published. There is a ten page bibliography citing sources for each chapter but no index—this is a substantial shortcoming, since many of the chapter titles do not name the principals in the events they describe, and since the paths of the most famous gunfighters crossed frequently, their stories are spread over a number of chapters.

 Permalink

Lloyd, Seth. Programming the Universe. New York: Alfred A. Knopf, 2006. ISBN 1-4000-4092-2.
The author has devoted his professional career to exploring the deep connections between information processing and the quantum mechanical foundations of the universe. Although his doctorate is in physics, he is a professor of mechanical engineering at MIT, which I suppose makes him an honest to God quantum mechanic. A pioneer in the field of quantum computation, he suggested the first physically realisable quantum computational device, and is the author of the landmark papers which evaluated the computational power of the “ultimate laptop” computer which, if its one kilogram of mass and one litre of volume crunched any faster, would collapse into a black hole; estimated the computational capacity of the entire visible universe; and explored how gravitation and spacetime could be emergent properties of a universal quantum computation.

In this book, he presents these concepts to a popular audience, beginning by explaining the fundamentals of quantum mechanics and the principles of quantum computation, before moving on to the argument that the universe as a whole is a universal quantum computer whose future cannot be predicted by any simulation less complicated than the universe as a whole, nor any faster than the future actually evolves (a concept reminiscent of Stephen Wolfram's argument in A New Kind of Science [August 2002], but phrased in quantum mechanical rather than classical terms). He argues that all of the complexity we observe in the universe is the result of the universe performing a computation whose input is the random fluctuations created by quantum mechanics. But, unlike the proverbial monkeys banging on typewriters, the quantum mechanical primate fingers are, in effect, typing on the keys of a quantum computer which, like the cellular automata of Wolfram's book, has the capacity to generate extremely complex structures from very simple inputs. Why was the universe so simple shortly after the big bang? Because it hadn't had the time to compute very much structure. Why is the universe so complicated today? Because it's had sufficient time to perform 10^122 logical operations up to the present.
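
Lloyd's number is easy to sanity-check. Here is a back-of-the-envelope sketch in Python using the Margolus-Levitin bound on computation speed (at most 2E/πℏ operations per second for available energy E); the round figures for the mass and age of the observable universe are my own assumptions, not Lloyd's inputs:

    # Crude check of the total operations the universe could have performed,
    # assuming the Margolus-Levitin bound applied to the mass-energy of the
    # observable universe. All inputs are rough round numbers.
    import math

    hbar = 1.055e-34        # reduced Planck constant, J*s
    c = 3.0e8               # speed of light, m/s
    mass = 1e53             # rough mass of the observable universe, kg
    age = 4.35e17           # ~13.8 billion years, in seconds

    energy = mass * c**2                           # total mass-energy, J
    ops_per_second = 2 * energy / (math.pi * hbar)
    total_ops = ops_per_second * age
    print(f"~10^{math.log10(total_ops):.0f} logical operations")

With these inputs the sketch prints ~10^121, within an order of magnitude of the book's figure, which is about as close as estimates of this kind ever agree.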

I found this book, on the whole, a disappointment. Having read the technical papers cited above before opening it, I didn't expect to learn any additional details from a popularisation, but I did hope the author would provide a sense for how the field evolved, where he saw this research programme going in the future, and how it might (or might not) fit with other approaches to the unification of quantum mechanics and gravitation. There are some interesting anecdotes about the discovery of the links between quantum mechanics, thermodynamics, statistical mechanics, and information theory, and the personalities involved in that work, but one leaves the book without any sense for where future research might be going, nor how these theories might be tested by experiment in the near or even distant future. The level of the intended audience is difficult to discern. Unlike some popularisers of science, Lloyd does not shrink from using equations where they clarify physical relationships and even introduces and uses Dirac's “bra-ket” notation (for example, <φ|ψ>), yet almost everywhere he writes a number in scientific notation, he also gives it in the utterly meaningless form of (p. 165) “100 billion billion billion billion billion billion billion billion billion billion” (OK, I've done that myself, on one occasion, but I was having fun at the expense of a competitor). And finally, I find it dismaying that a popular science book by a prominent researcher, published by a house as respectable as Knopf at a cover price of USD 26, lacks an index—this is a fundamental added value the reader deserves when parting with this much money (especially for a book of only 220 pages). If you know nothing about these topics, this volume will probably leave you only more confused, and possibly over-optimistic about the state of quantum computation. If you've followed the field reasonably closely, the author's professional publications (most available on-line), which are lucidly written and accessible to the non-specialist, may be more rewarding.

I remain dubious about grandiose claims for quantum computation, and nothing in this book dispelled my scepticism. From Democritus all the way to the present day, every single scientific theory which assumed the existence of a continuum has been proved wrong when experiments looked more closely at what was really going on. Yet quantum mechanics, albeit a statistical theory at the level of measurement, is completely deterministic and linear in the evolution of the wave function, with amplitudes given by continuous complex values which embody, theoretically, an infinite amount of information. Where is all this information stored? The Bekenstein bound gives an upper limit on the amount of information which can be represented in a given volume of spacetime, and that implies that even if the quantum state were stored nonlocally in the entire causally connected universe, the amount of information would still be finite, albeit enormous. Extreme claims for quantum computation assume you can linearly superpose any number of wave functions and thus encode as much information as you like in a single computation. The entire history of science, and of quantum mechanics itself, makes me doubt that this is so—I'll bet that we eventually find some inherent granularity in the precision of the wave function (perhaps round-off errors in the simulation we're living within, but let's not revisit that). This is not to say, nor do I mean to imply, that quantum computation will not work; indeed, it has already been demonstrated in proof-of-concept laboratory experiments, and it may well hold the potential of extending the growth of computational power after the pure scaling of classical computers runs into physical limits. But just as shrinking semiconductor devices is fundamentally constrained by the size of atoms, quantum computation may be limited by the ultimate precision of the discrete computational substrate of the universe which behaves, on the large scale, like a continuous wave function.
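
The finiteness argument is easy to make concrete. Here is a rough evaluation of the Bekenstein bound, I ≤ 2πRE/(ℏc ln 2), for a region the size of the observable universe, again with round inputs of my own choosing:

    # Rough upper limit on the information content of the observable
    # universe from the Bekenstein bound. Inputs are round estimates.
    import math

    hbar = 1.055e-34    # reduced Planck constant, J*s
    c = 3.0e8           # speed of light, m/s
    R = 4.4e26          # radius of the observable universe, m
    E = 1e53 * c**2     # mass-energy of its contents, J

    bits = 2 * math.pi * R * E / (hbar * c * math.log(2))
    print(f"~10^{math.log10(bits):.0f} bits")

About 10^123 bits: a stupendous number, but a finite one.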

 Permalink

Ponnuru, Ramesh. The Party of Death. Washington: Regnery Publishing, 2006. ISBN 1-59698-004-4.
One party government is not a pretty thing. Just as competition in the marketplace reins in the excesses of would-be commercial predators (while monopoly encourages them to do their worst), long-term political dominance by a single party inevitably leads to corruption, disconnection of the ruling elites from their constituents, and unsustainable policy decisions which are destructive in the long term; this is precisely what eventually precipitated the collapse of most empires. In recent years the federal government of the United States has been dominated by the Republican party, with all three branches of government and both houses of Congress in Republican hands. Chapter 18 of this fact-packed book cites a statistic which provides a stunning insight into an often-overlooked aspect of the decline of the Democratic party. In 1978, Democrats held 292 seats in the House of Representatives: an overwhelming super-majority of more than two thirds. Of these Democrats, 125, more than 40%, were identified as “pro-life”—opposed to abortion on demand and federal funding of abortion. But by 2004, only 35 Democrats in the House were identified as pro-life: fewer than 18%, and the total number of Democrats had shrunk to only 203, a minority of less than 47%. It is striking to observe that over a period of 26 years the number of pro-life Democrats dropped by 90, almost identical to the party's total loss of 89 seats.

Now, the Democratic decline is more complicated than any single issue, but as the author documents, the Democratic activist base and large financial contributors are far more radical than the American public at large on issues of human life: unrestricted and subsidised abortion, euthanasia and assisted suicide, stem cell research which destroys human embryos, and human cloning for therapeutic purposes. (The often deceptive questions used to manipulate the results of public opinion polls, and the way they are spun in the overwhelmingly pro-abortion legacy media, are discussed at length.) The activists and moneybags make the Democratic party a hostile environment for pro-life politicians and have, over the decades, selected them out, applying an often explicit litmus test to potential candidates, who are not allowed to deviate from absolutist positions. Their adherence to views not shared by most voters then makes them vulnerable in the general election.

Apart from the political consequences, the author examines the curious flirtation of the American left with death in all its forms—a strange alliance for a political philosophy which traditionally stressed protecting the weak and vulnerable: in the words of Hubert Humphrey (who was pro-life), “those who are in the dawn of life, the children; those who are in the twilight of life, the elderly; and those who are in the shadows of life, the sick, the needy, and the handicapped” (p. 131).

The author argues against the panoply of pro-death policies exclusively from a human rights standpoint. Religion is not mentioned except to refute the claim that pro-life policies are an attempt to impose a sectarian agenda on a secular society. The human rights argument could not be simpler to grasp: if you believe that human beings have inherent, unalienable rights, simply by being human, then what human right could conceivably be more fundamental than the right not to be killed? If one accepts this (and the paucity of explicitly pro-murder voters would seem to indicate the view is broadly shared), then the only way one can embrace policies which permit the destruction of a living human organism is to define criteria which distinguish a “person”, who cannot be killed, from those who are not persons and therefore can be. Thus one hears the human embryo or fetus (which has the potential of developing into an adult human) described as a “potential human”, and medical patients in a persistent vegetative state as having no personhood. Professor Peter Singer, bioethicist at the Center for Human Values at Princeton University, argues (p. 176), “[T]he concept of a person is distinct from that of a member of the species Homo sapiens, and that it is personhood, not species membership, that is most significant in determining when it is wrong to end a life.”

But the problem with drawing lines that divide unarguably living human beings into classes of persons and nonpersons is that the distinctions are rarely clear-cut. If a fetus in the first three months of pregnancy is a nonperson, then what changes on the first day of the fourth month to confer personhood on the continuously developing baby? Why not five months, or six? And if a woman in the U.S. has a constitutionally protected right to have her child killed right up until the very last part of its body emerges from the birth canal (as is, in fact, the regime in effect today in the United States, notwithstanding media dissimulation of this reality), then what's so different about killing a newborn baby if, for example, it was found to have a birth defect which was not detected in utero? Professor Singer has no problem with this at all; he enumerates a variety of prerequisites for personhood: “rationality, autonomy, and self-consciousness”, and then concludes, “Infants lack these characteristics. Killing them, therefore, cannot be equated with killing normal human beings, or any other self-conscious beings.”

It's tempting to dismiss Singer as another of the many intellectual Looney Tunes which decorate the American academy, but Ponnuru defends him for having the intellectual integrity to follow the premises he shares with many absolutists on these issues all the way to their logical conclusions, leading him to predict (p. 186), “[d]uring the next 35 years, the traditional view of the sanctity of human life will collapse…. By 2040, it may be that only a rump of hard-core, know-nothing religious fundamentalists will defend the view that every human life, from conception to death, is sacrosanct.” Doesn't that sound like a wonderful world, especially for those of us who expect to live out our declining years as that brave new era dawns, at least for those suitably qualified “persons” permitted to live long enough to get there?

Many contend that such worries are simply “the old slippery slope argument”, thinking that settles the matter. But the problem is that the old slippery slope argument is often right, and in this case there is substantial evidence that it very much applies. The enlightened Dutch seem to have slid further and faster than others in the West, permitting both assisted suicide for the ill and euthanasia for seriously handicapped infants at the parents' request—in theory. In fact, it is estimated that five percent of all deaths in The Netherlands are the result of euthanasia by doctors without request (which is nominally illegal), that five percent of infanticides occur without the request or consent of the parents, and it is seldom noted in the media that the guidelines which permit these “infanticides” actually apply to children up to the age of twelve. Perhaps that's why the Dutch are so polite—young hellions run the risk not only of a paddling but also of “post-natal abortion”. The literally murderous combination of an aging population supported by a shrinking number of working-age people, state-sanctioned euthanasia, and socialised medicine is fearful to contemplate.

These are difficult issues, and the political arena has become so polarised into camps of extremists on both sides that rational discussion and compromise seem almost impossible. This book, while taking a pro-life perspective, eschews rhetoric in favour of rational argumentation grounded in the principles of human rights which date to the Enlightenment. One advantage of applying human rights to all humans is that it's simple and easy to understand. History is rich in examples which show that once a society starts sorting people into persons and nonpersons, things generally start to go South pretty rapidly. Like it or not, these are issues which modern society is going to have to face: advances in medical technologies create situations that call for judgements people never had to make before. For those who haven't adopted one extreme position or another, and are inclined to let the messy democratic process of decision making sort this out, ideally leaving as much discretion as possible to the individuals involved, as opposed to absolutist “rights” discovered in constitutional law and imposed by judicial diktat, this unsettling book is a valuable contribution to the debate. Democratic party stalwarts are unlikely in the extreme to read it, but they ignore this message at their peril.

The book is not very well edited. There are a number of typographical errors, and on two occasions (pp. 94 and 145) the author's interpolations in the middle of extended quotations are set as if they were part of the quotation. It is, however, well documented; there are thirty-four pages of source citations.

 Permalink

August 2006

Sullivan, Robert. Rats. New York: Bloomsbury, [2004] 2005. ISBN 1-58234-477-9.
Here we have one of the rarest phenomena in publishing: a thoroughly delightful best-seller about a totally disgusting topic: rats. (Before legions of rat fanciers write to berate me for bad-mouthing their pets, let me state at the outset that this book is about wild rats, not the pet and laboratory rats which have been bred for docility for a century and a half. The new afterword to this paperback edition relates the story of a Brooklyn couple who caught a juvenile Bedford-Stuyvesant street rat to fill the empty cage of their recently deceased pet and, as it matured, came to regard it with such fear that they were afraid even to release it in a park lest it turn and attack them when the cage was opened—the author suggested they might consider the strategy of “open the cage and run like hell” [pp. 225–226]. One of the pioneers in the use of rats in medical research in the early years of the 20th century tried to use wild rats and concluded “they proved too savage to maintain in the laboratory” [p. 231].)

In these pages are more than enough gritty rat facts to get yourself ejected from any polite company should you introduce them into a conversation. Many misconceptions about rats are debunked, including the oft-cited estimate that the rat and human populations are about the same, which would lead to an estimate of about eight million rats in New York City—in fact, the most authoritative estimate (p. 20) puts the number at about 250,000, which is still a lot of rats, especially once you begin to appreciate what a single rat can do. (But rat exaggeration gets folks' attention: here is a politician claiming there are fifty-six million rats in New York!) “Rat stories are war stories” (p. 34), and this book teems with them, including The Rat that Came Up the Toilet, which is not an urban legend but a well-documented urban nightmare. (I'd be willing to bet that the incidence of people keeping the toilet lid closed with a brick on top is significantly greater among readers of this book.)

It's common for naturalists who study an animal to develop sympathy for it and defend it against popular aversion: snakes and spiders, for example, have many apologists. But not rats: the author sums up by stating that he finds them “disgusting”, and he isn't alone. The great naturalist and wildlife artist John James Audubon, one of the rare painters ever to depict rats, amused himself during the last years of his life in New York City by prowling the waterfront hunting rats, having received permission from the mayor “to shoot Rats in the Battery” (p. 4).

If you want to really get to know an animal species, you have to immerse yourself in its natural habitat, and for the Brooklyn-based author, this involved no more than a subway ride to Edens Alley in downtown Manhattan, just a few blocks from the site of the World Trade Center, which was destroyed during the year he spent observing rats there. Along with rat stories and observations, he sketches the history of New York City from a ratty perspective, with tales of the arrival of the brown rat (possibly on ships carrying Hessian mercenaries to fight for the British during the War of American Independence), the rise and fall of rat fighting as popular entertainment in the city, the great garbage strike of 1968 which transformed the city into something close to heaven if you happened to be a rat, and the 1964 Harlem rent strike in which rats were presented to politicians by the strikers to acquaint them with the living conditions in their tenements.

People involved with rats tend to be outliers on the scale of human oddness, and the reader meets a variety of memorable characters, present-day and historical: rat fight impresarios, celebrity exterminators, Queen Victoria's rat-catcher, and many more. Among the numerous fascinating items in this rat fact packed narrative is just how recent the arrival of the mis-named brown rat, Rattus norvegicus, is. (The species was named in England in 1769, in the belief that it had stowed away on ships carrying lumber from Norway. In fact, it appears to have arrived in Britain before it reached Norway.) There were no brown rats in Europe at all until the 18th century (the rats which caused the Black Death were Rattus rattus, the black rat, which followed Crusaders returning from the Holy Land). First arriving in America around the time of the Revolution, the brown rat took until 1926 to spread to every state in the United States, displacing the black rat except for some remaining in the South and West. The Canadian province of Alberta remains essentially rat-free to this day, thanks to a vigorous and vigilant rat control programme.

The number of rats in an area depends almost entirely upon the food supply available to them. A single breeding pair of rats, with an unlimited food supply and no predation or other causes of mortality, can produce on the order of fifteen thousand descendants in a single year. That makes it pretty clear that a rat population will grow until all available food is being consumed by rats (and that natural selection will favour the most aggressive individuals in a food-constrained environment). Poison or trapping can knock down the rat population in the case of a severe infestation, but without limiting the availability of food, will produce only a temporary reduction in their numbers (while driving evolution to select for rats which are immune to the poison and/or more wary of the bait stations and traps).
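
If the fifteen thousand figure sounds implausible, a toy model makes it believable. The following Python sketch uses breeding parameters I chose myself from commonly quoted figures for brown rats (litters of ten every seven weeks, with daughters first breeding roughly two cycles after birth); it is an illustration of exponential growth, not of rat biology:

    # Toy model: one founding pair, unlimited food, no mortality.
    LITTER_SIZE = 10       # pups per litter (assumed)
    INTERVAL_WEEKS = 7     # weeks between litters (assumed)
    WEEKS_IN_YEAR = 52

    breeding_females = 1.0   # the founding female
    maturing_females = 0.0   # daughters not yet old enough to breed
    descendants = 0.0

    for week in range(0, WEEKS_IN_YEAR, INTERVAL_WEEKS):
        births = breeding_females * LITTER_SIZE
        descendants += births
        breeding_females += maturing_females   # last cycle's daughters join the pool
        maturing_females = births / 2          # half of this litter is female

    print(f"descendants after one year: ~{descendants:,.0f}")

With these assumptions the model yields roughly twelve thousand descendants in a year, the same order as the figure above; the exact number is exquisitely sensitive to the assumed litter interval, which is the nature of exponential growth.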

Given this fact, which is completely noncontroversial among pest control professionals, it is startling that in New York City, which frets over and regulates public health threats like second-hand tobacco smoke while its denizens suffer more than 150 rat bites a year, many to children, smoke-free restaurants dump their offal into rat-infested alleys in thin plastic garbage bags, which are instantly penetrated by rats. How much could it cost to mandate, or even provide, rat-proof steel containers for organic waste, compared to the budget for rodent control and the damages and health hazards of a large rat population? Rats will always be around—in 1936, the president of the professional society for exterminators persuaded the organisation to change the name of the occupation from “exterminator” to “pest control operator”, not because the word “exterminator” was distasteful, but because he felt it over-promised what could actually be achieved for the client (p. 98). But why not take some simple, obvious steps to constrain the rat population?

The book contains more than twenty pages of notes in narrative form, which contain a great deal of additional information you don't want to miss, including the origin of giant inflatable rats for labour rallies, and even a poem by exterminator guru Bobby Corrigan. There is no index.

 Permalink

Staley, Kent W. The Evidence for the Top Quark. Cambridge: Cambridge University Press, 2004. ISBN 0-521-82710-8.
A great deal of nonsense and intellectual nihilism has been committed in the name of “science studies”. Here, however, is an exemplary volume which shows not only how the process of scientific investigation should be studied, but also why. The work is based on the author's dissertation in philosophy, which explored the process leading to the September 1994 publication of the “Evidence for top quark production in p̄p collisions at √s = 1.8 TeV” paper in Physical Review D. This paper is a quintessential example of Big Science: more than four hundred authors, sixty pages of intricate argumentation from data produced by a detector weighing more than two thousand tons, and automated examination of millions and millions of collisions between protons and antiprotons accelerated to almost the speed of light by the Tevatron, all to search, over a period of months, for an elementary particle which cannot be observed in isolation, and finally reporting “evidence” for its existence (but not “discovery” or “observation”) based on a total of just twelve events “tagged” by three different algorithms, when about 5.7 events would have been expected from other causes (“background”) purely by chance.
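
To get a feel for the statistical question at the heart of the paper, here is a naive Poisson counting calculation in Python. It ignores everything that made the real analysis difficult (systematic uncertainties, the combination of multiple channels) and is only a sketch of the flavour of the argument:

    # If background alone yields 5.7 events on average, how often would
    # chance produce 12 or more? Naive counting-experiment calculation.
    import math

    mu = 5.7          # expected background events
    observed = 12     # events actually tagged

    p_tail = 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k)
                       for k in range(observed))
    print(f"chance of >= {observed} background events: {p_tail:.2%}")

This crude version comes out near one and a half percent: suggestive, but hardly conclusive, which is one way to understand why the collaboration chose the cautious word “evidence”.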

Through extensive scrutiny of contemporary documents and interviews with participants in the collaboration which performed the experiment, the author provides a superb insight into how science on this scale is done, and the process by which the various kinds of expertise distributed throughout a large collaboration come together to arrive at the consensus that they have found something worthy of publication. He explores the controversies about the paper both within the collaboration and subsequent to its publication, and evaluates claims that choices made by the experimenters may have produced a bias in the results, and/or that choosing experimental “cuts” after having seen data from the detector might constitute “tuning on the signal”: physicist-speak for choosing the criteria for experimental success after having seen the results from the experiment, a violation of the “predesignation” principle usually assumed in statistical tests.

In the final two, more philosophical, chapters, the author introduces the concept of “Error-Statistical Evidence”, and evaluates the analysis in the “Evidence” paper in those terms, concluding that despite all the doubt and controversy, the decision making process was ultimately objective. (And, of course, subsequent experimentation has shown the information reported in the Evidence paper to have been essentially correct.)

Popular accounts of high energy physics sometimes gloss over the fantastically complicated and messy observations which go into a reported result to such an extent you might think experimenters just sit watching a screen, waiting for a little ball with a “t” or whatever stencilled on the side to pop out. This book reveals the subtlety of the actual data from these experiments, and the intricate chain of reasoning from the multitudinous electronic signals issuing from a particle detector to the claim of having discovered a new particle. This is not, however, remotely a work of popularisation. While attempting to make the physics accessible to philosophers of science and the philosophy comprehensible to physicists, each will find the portions outside their own speciality tough going. A reader without a basic understanding of the standard model of particle physics and the principles of statistical hypothesis testing will probably end up bewildered and may not make it to the end, but those who do will be rewarded with a detailed understanding of high energy particle physics experiments and the operation of large collaborations of researchers which is difficult to obtain anywhere else.

 Permalink

Wilczek, Frank. Fantastic Realities. Singapore: World Scientific, 2006. ISBN 981-256-655-4.
The author shared the 2004 Nobel Prize in Physics for the discovery of “asymptotic freedom” in the strong interaction of quarks and gluons, which laid the foundation of the modern theory of Quantum Chromodynamics (QCD) and the Standard Model of particle physics. This book is an anthology of his writing for general and non-specialist scientific audiences over the last fifteen years, including eighteen of his “Reference Frame” columns from Physics Today and his Nobel prize autobiography and lecture.

I had eagerly anticipated reading this book. Frank Wilczek and his wife Betsy Devine are co-authors of the 1988 volume Longing for the Harmonies, which I consider to be one of the best works of science popularisation ever written, and whose “theme and variation” structure I adopted for my contemporary paper “The New Technological Corporation”. Wilczek is not only a brilliant theoretician, he has a tremendous talent for explaining the arcana of quantum mechanics and particle physics in lucid prose accessible to the intelligent layman, and his command of the English language transcends pedestrian science writing and sometimes verges on the poetic, occasionally crossing the line: this book contains six original poems!

The collection includes five book reviews, in a section titled “Inspired, Irritated, Inspired”, the author's reaction to the craft of reviewing books, which he describes as “like going on a blind date to play Russian roulette” (p. 305). After finishing this 500 page book, I must sadly report that my own experience can be summed up as “Inspired, Irritated, Exasperated”. There is inspiration aplenty and genius on display here, but you're left with the impression that this is a quickie book assembled by throwing together all the popular writing of a Nobel laureate and rushed out the door to exploit his newfound celebrity. This is not something you would expect of World Scientific, but the content of the book argues otherwise.

Frank Wilczek writes frequently for a variety of audiences on topics central to his work: the running of the couplings in the Standard Model, low energy supersymmetry and the unification of forces, a possible SO(10) grand unification of fundamental particles, and lattice QCD simulation of the mass spectrum of mesons and hadrons. These are all fascinating topics, and Wilczek does them justice here. The problem is that with all of these various articles collected in one book, he does them justice again, again, and again. Four illustrations: the lattice QCD mass spectrum, the experimentally measured running of the strong interaction coupling, the SO(10) particle unification chart, and the unification of forces with and without supersymmetry, appear and are discussed three separate times (the latter four times) in the text; this gets tedious.
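
For readers curious what the “running” of a coupling actually looks like as numbers, here is a one-loop sketch in Python, anchored at the measured strong coupling at the Z mass. A real calculation would vary the number of active quark flavours with threshold matching and include higher-loop terms; this is only the textbook leading-order formula:

    # One-loop running of the strong coupling alpha_s:
    #   alpha_s(Q^2) = alpha_s(mu^2) / (1 + b0/(4*pi) * alpha_s(mu^2) * ln(Q^2/mu^2))
    # with b0 = 11 - 2*nf/3, anchored at the Z mass.
    import math

    alpha_mz = 0.118   # measured alpha_s at the Z mass
    m_z = 91.19        # Z boson mass, GeV
    n_f = 5            # active quark flavours (held fixed for simplicity)
    b0 = 11 - 2 * n_f / 3

    def alpha_s(q_gev):
        log = math.log(q_gev**2 / m_z**2)
        return alpha_mz / (1 + b0 / (4 * math.pi) * alpha_mz * log)

    for q in (2, 10, 91.19, 1000):
        print(f"alpha_s({q:7.2f} GeV) = {alpha_s(q):.3f}")

Asymptotic freedom is visible directly in the output: the coupling weakens as the energy grows.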

There is sufficient wonderful stuff in this book to justify reading it, but don't feel duty-bound to slog through the nth repetition of the same material; a diligent editor could easily cut at least a third of the book, and probably close to half without losing any content. The final 70 pages are excerpts from Betsy Devine's Web log recounting the adventures which began with that early morning call from Sweden. The narrative is marred by the occasional snarky political comment which, while appropriate in a faculty wife's blog, is out of place in an anthology of the work of a Nobel laureate who scrupulously avoids mixing science and politics, but still provides an excellent inside view of just what it's like to win and receive a Nobel prize.

 Permalink

Scalzi, John. The Ghost Brigades. New York: Tor, 2006. ISBN 0-7653-1502-5.
After his stunning fiction debut in Old Man's War (April 2005), readers hoping for the arrival on the scene of a new writer of Golden Age stature held their breath to see whether the author would be a one book wonder or be able to repeat. You can start breathing again—in this, his second novel, he hits another one out of the ballpark.

This story is set in the conflict-ridden Colonial Union universe of Old Man's War, some time after the events of that book. Although in the acknowledgements he refers to this as a sequel, you'd miss little or nothing by reading it first, as everything introduced in the first novel is explained as it appears here. Still, if you have the choice, it's best to read them in order. The Colonial Special Forces, which are a shadowy peripheral presence in Old Man's War, take centre stage here. Special Forces are biologically engineered and enhanced super-soldiers, bred from the DNA of volunteers who enlisted in the regular Colonial Defense Forces but died before reaching the age of 75, when they would have begun their new lives as warriors. Unlike regular CDF troops, who retain their memories and personalities after exchanging their aged frame for a youthful and super-human body, Special Forces start out as a tabula rasa with adult bodies and empty brains, ready to be programmed by their “BrainPal” appliance, which also gives them telepathic powers.

The protagonist, Jared Dirac, is a very special member of the Special Forces, as he was bred from the DNA of a traitor to the Colonial Union, and imprinted with that person's consciousness in an attempt to figure out his motivations and plans. Things didn't go as expected, and Jared ends up with two people in his skull, leading to exploration of the meaning of human identity and how our memories (or those of others) make us who we are, along the lines of Robert Heinlein's I Will Fear No Evil. The latter was not one of Heinlein's better outings, but Scalzi takes the nugget of the idea and runs with it here, spinning a yarn that reads like Heinlein's better work. In the last fifty pages, the Colonial Union universe becomes a lot more ambiguous and interesting, and the ground is laid for a rich future history series set there. This book has less rock-em sock-em combat and more character development and ideas, which is just fine for this non-member of the video game generation.

Since almost anything more I said would constitute a spoiler, I'll leave it at that; I loved this book, and if you enjoy the best of Heinlein, you probably will as well. (One quibble, which I'll try to phrase to avoid being a spoiler: for the life of me, I can't figure out how Sagan expects to open the capture pod at the start of chapter 14 (p. 281), when on p. 240 she couldn't open it, and since then nothing has happened to change the situation.) For more background on the book and the author's plans for this universe, check out the Instapundit podcast interview with the author.

 Permalink

September 2006

Howard, Michael, David LeBlanc, and John Viega. 19 Deadly Sins of Software Security. Emeryville, CA: Osborne, 2005. ISBN 0-07-226085-8.
During his brief tenure as director of the National Cyber Security Division of the U.S. Department of Homeland Security, Amit Yoran (who wrote the foreword to this book) got a lot of press attention when he claimed, “Ninety-five percent of software bugs are caused by the same 19 programming flaws.” The list of these 19 dastardly defects was assembled by John Viega, who, with his two co-authors, both of whom worked on computer security at Microsoft, attempts to exploit its notoriety in this poorly written, jargon-filled, and utterly worthless volume. Of course, I suppose that's what one should expect when a former official of the agency of geniuses who humiliate millions of U.S. citizens every day to protect them from the peril of grandmothers with exploding sneakers teams up with a list of authors that includes a former “security architect for Microsoft's Office division”—why does the phrase “macro virus” immediately come to mind?

Even after reading this entire ramble on the painfully obvious, I cannot remotely guess who the intended audience was supposed to be. Software developers who know enough to decode what the acronym-packed (many never or poorly defined) text is trying to say are already aware of the elementary vulnerabilities being discussed and ways to mitigate them. Those without knowledge of competent programming practice are unlikely to figure out what the authors are saying, since their explanations in most cases assume the reader is already aware of the problem. The book is also short (281 pages), generous with white space, and packed with filler: the essential message of what to look out for in code can be summarised in a half-page table; in fact, it has been, on page 262! Not only does every chapter end with a summary of “do” and “don't” recommendations, all of these lists are duplicated in a ten page appendix at the end, presumably added because the original manuscript was too short. Other obvious padding is giving examples of trivial code in a long list of languages (including proprietary trash such as C#, Visual Basic, and the .NET API); around half of the code samples are Microsoft-specific, as are the “Other Resources” at the end of each chapter. My favourite example is on pp. 176–178, which gives sample code showing how to read a password from a file (instead of idiotically embedding it in an application) in four different programming languages: three of them Microsoft-specific.

Like many bad computer books, this one seems to assume that programmers can learn only from long enumerations of specific items, as opposed to a theoretical understanding of the common cause which underlies them all. In fact, a total of eight chapters on supposedly different “deadly sins” can be summed up in the following admonition, “never blindly trust any data that comes from outside your complete control”. I had learned this both from my elders and brutal experience in operating system debugging well before my twentieth birthday. Apart from the lack of content and ill-defined audience, the authors write in a dialect of jargon and abbreviations which is probably how morons who work for Microsoft speak to one another: “app”, “libcall”, “proc”, “big-honking”, “admin”, “id” litter the text, and the authors seem to believe the word for a security violation is spelt “breech”. It's rare that I read a technical book in any field from which I learn not a single thing, but that's the case here. Well, I suppose I did learn that a prominent publisher and forty dollar cover price are no guarantee the content of a book will be of any value. Save your money—if you're curious about which 19 “sins” were chosen, just visit the Amazon link above and display the back cover of the book, which contains the complete list.
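
That one admonition deserves at least one concrete illustration, which the book never cleanly provides. Here is a minimal Python sketch of what “never blindly trust outside data” means in practice; the record format (a two-byte big-endian length prefix followed by a payload) is invented for the example:

    # Parse a length-prefixed record from untrusted bytes, validating
    # every externally supplied quantity before using it.
    MAX_PAYLOAD = 4096   # arbitrary sanity limit for this example

    def read_record(data: bytes) -> bytes:
        if len(data) < 2:
            raise ValueError("truncated length prefix")
        length = int.from_bytes(data[:2], "big")
        if length > MAX_PAYLOAD:            # reject absurd declared sizes
            raise ValueError("declared length exceeds limit")
        if len(data) < 2 + length:          # never trust the prefix itself
            raise ValueError("payload shorter than declared length")
        return data[2:2 + length]

Apply the same reflex to file paths, SQL, shell arguments, and buffer lengths, and you have the content of those eight chapters.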

 Permalink

Mayer, Milton. They Thought They Were Free. 2nd ed. Chicago: University of Chicago Press, [1955] 1966. ISBN 0-226-51192-8.
The author, a journalist descended from German Jewish immigrants to the United States, first visited Nazi Germany in 1935, spending a month in Berlin attempting, unsuccessfully, to obtain an interview with Hitler, notwithstanding the assistance of his friend, the U.S. ambassador, then travelled through the country reporting for a U.S. magazine. It was then that he first discovered, meeting with ordinary Germans, that Nazism was not, as many perceived it then and now, “the tyranny of a diabolical few over helpless millions” (p. xviii), but rather a mass movement grounded in the “little people” with a broad base of non-fanatic supporters.

Ten years after the end of the war, Mayer arranged a one year appointment as a visiting professor at the University of Frankfurt and moved, with his family, to a nearby town of about 20,000 he calls “Kronenberg”. There, he spent much of his time cultivating the friendship of ten men he calls “my ten Nazi friends”, all of whom had joined the party for reasons ranging from ideology, to assistance in finding or keeping employment, to admiration of what they saw as Hitler's success (before the war) in restoring the German economy and Germany's position in the world. A large part of the book is reconstructed conversations with these people, exploring the motivations of those who supported Hitler (many of whom continued, a decade after Germany's disastrous defeat in the war he started, to believe the years of his rule prior to the war were Germany's golden age). Together they provide a compelling picture of life in a totalitarian society as perceived by people who liked it.

This is simultaneously a profoundly enlightening and disturbing book. The author's Nazi friends come across as almost completely unexceptional, and one comes to understand how the choices they made, rooted in the situation in which they found themselves, made perfect sense to them. And then, one cannot help but ask, “What would I have done in the same circumstances?” Mayer has no truck with what has come to be called multiculturalism—he is a firm believer in national character (although, of course, only on the average, with large individual variation), and he explains how history, over almost two millennia, has forged the German character and why it is unlikely to be changed by military defeat and a few years of occupation.

Apart from the historical insights, this book is highly topical when a global superpower is occupying a very different country, with a tradition and history far more remote from its own than was Germany's, and trying to instill institutions with no historical roots there. People forget, but ten years after the end of World War II many, Mayer included, considered the occupation of Germany to have been a failure. He writes (p. 303):

The failure of the Occupation could not, perhaps, have been averted in the very nature of the case. But it might have been mitigated. Its mitigation would have required the conquerors to do something they had never had to do in their history. They would have had to stop doing what they were doing and ask themselves some questions, hard questions, like, What is the German character? How did it get that way? What is wrong with its being that way? What way would be better, and what, if anything, could anybody do about it?
Wise questions, indeed, for any conqueror of any country.

The writing is so superb that you may find yourself re-reading paragraphs just to savour how they're constructed. It is also thought-provoking to ponder how many things, from the perspective of half a century later, the author got wrong. In his view, the occupation of West Germany would fail to permanently implant democracy, German re-militarisation and eventual aggression were almost certain unless blocked by force, and the project of European unification was a pipe dream of idealists, doomed to failure. And yet, today, things seem to have turned out pretty well for Germany, the Germans, and their neighbours. The lesson of this may be that national character can be changed, but changing it is the work of generations, not a few years of military occupation. That is also something modern-day conquerors, especially Western societies with a short attention span, might want to bear in mind.

 Permalink

Smolin, Lee. The Trouble with Physics. New York: Houghton Mifflin, 2006. ISBN 0-618-55105-0.
The first forty years of the twentieth century saw a revolution in fundamental physics: special and general relativity changed our perception of space, time, matter, energy, and gravitation; quantum theory explained all of chemistry while wiping away the clockwork determinism of classical mechanics and replacing it with a deeply mysterious theory which yields fantastically precise predictions yet nobody really understands at its deepest levels; and the structure of the atom was elucidated, along with important clues to the mysteries of the nucleus. In the large, the universe was found to be enormously larger than expected and expanding—a dynamic arena which some suspected might have an origin and a future vastly different than its present state.

The next forty years worked out the structure and interactions of the particles and forces which constitute matter and govern its interactions, resulting in a standard model of particle physics with precisely defined theories which predicted all of the myriad phenomena observed in particle accelerators and in the highest energy events in the heavens. The universe was found to have originated in a big bang no more distant than three times the age of the Earth, and the birth cry of the universe had been detected by radio telescopes.

And then? Unexpected by almost all practitioners of high energy particle physics, which had become an enterprise larger by far than all of science at the start of the century, progress stopped. Since the wrapping up of the standard model around 1975, experiments have simply confirmed its predictions (with the exception of the discovery of neutrino oscillations and consequent mass, but that can be accommodated within the standard model without changing its structure), and no theoretical prediction of phenomena beyond the standard model has been confirmed experimentally.

What went wrong? Well, we certainly haven't reached the End of Science or even the End of Physics, because the theories which govern phenomena in the very small and very large—quantum mechanics and general relativity—are fundamentally incompatible with one another and produce nonsensical or infinite results when you attempt to perform calculations in the domain—known to exist from astronomical observations—where both must apply. Even a calculation as seemingly straightforward as estimating the energy of empty space yields a result which is 120 orders of magnitude greater than experiment shows it to be: perhaps the most embarrassing prediction in the history of science.
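
That embarrassment is simple arithmetic to reproduce. A crude sketch follows, assuming the naive cutoff estimate of one Planck energy per Planck volume for the vacuum energy density and a round observed value for the dark energy density (both inputs are my own round numbers):

    # Naive quantum-field-theory vacuum energy density (Planck-scale cutoff)
    # versus the observed dark-energy density.
    import math

    hbar = 1.055e-34   # reduced Planck constant, J*s
    c = 3.0e8          # speed of light, m/s
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

    planck_density = c**7 / (hbar * G**2)   # J/m^3, cutoff at the Planck scale
    observed_density = 6e-10                # J/m^3, rough cosmological value

    print(f"mismatch: ~10^{math.log10(planck_density / observed_density):.0f}")

This version of the estimate gives about 123 orders of magnitude; published figures between 60 and 123 differ according to where the cutoff is placed.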

In the first chapter of this tour de force, physicist Lee Smolin poses “The Five Great Problems in Theoretical Physics”, all of which are just as mysterious today as they were thirty-five years ago. Subsequent chapters explore the origin and nature of these problems, and how it came to be, despite unprecedented levels of funding for theoretical and experimental physics, that we seem to be getting nowhere in resolving any of these fundamental enigmas.

This prolonged dry spell in high energy physics has seen the emergence of string theory (or superstring theory, or M-theory, or whatever they're calling it this year) as the dominant research program in fundamental physics. At the outset, there were a number of excellent reasons to believe that string theory pointed the way to a grand unification of all of the forces and particles of physics, and might answer many, if not all, of the Great Problems. This motivated many very bright people, including the author (who, although most identified with loop quantum gravity research, has published in string theory as well) to pursue this direction. What is difficult for an outsider to comprehend, however, is how a theoretical program which, after thirty-five years of intensive effort, has yet to make a single prediction testable by a plausible experiment; has failed to predict any of the major scientific surprises that have occurred over those years such as the accelerating expansion of the universe and the apparent variation in the fine structure constant; that does not even now exist in a well-defined mathematical form; and has not been rigorously proved to be a finite theory; has established itself as a virtual intellectual monopoly in the academy, forcing aspiring young theorists to work in string theory if they are to have any hope of finding a job, receiving grants, or obtaining tenure.

It is this phenomenon, not string theory itself, which, in the author's opinion, is the real “Trouble with Physics”. He considers string theory as quite possibly providing clues (though not the complete solution) to the great problems, and finds much to admire in many practitioners of this research. But monoculture is as damaging in academia as in agriculture, and when it becomes deeply entrenched in research institutions, squeezes out other approaches of equal or greater merit. He draws the distinction between “craftspeople”, who are good at performing calculations, filling in blanks, and extending an existing framework, and “seers”, who make the great intellectual leaps which create entirely new frameworks. After thirty-five years with no testable result, there are plenty of reasons to suspect a new framework is needed, yet our institutions select out those most likely to discover them, or force them to spend their most intellectually creative years doing tedious string theory calculations at the behest of their elders.

In the final chapters, Smolin looks at how academic science actually works today: how hiring and tenure decisions are made, how grant applications are evaluated, and the difficult career choices young physicists must make to work within this system. When reading this, the word “Gosplan” (Госпла́н) kept flashing through my mind, for the process he describes resembles nothing so much as central planning in a command economy: a small group of senior people, distant from the facts on the ground and the cutting edge of intellectual progress, trying to direct a grand effort in the interest of “efficiency”. But the lesson of more than a century of failed socialist experiments is that, in the timeless words of Rocket J. Squirrel, “that trick never works”—the decisions inevitably come down on the side of risk aversion, and are often influenced by cronyism and toadying to figures in authority. The concept of managing risk and reward by building a diversified portfolio of low and high risk placements which is second nature to managers of venture capital funds and industrial research and development laboratories appears to be totally absent in academic science, which is supposed to be working on the most difficult and fundamental questions. Central planning works abysmally for cement and steel manufacturing; how likely is it to spark the next scientific revolution?

There is much more to ponder: why string theory, as presently defined, cannot possibly be a complete theory which subsumes general relativity; hints from experiments which point to new physics beyond string theory; stories of other mathematically beautiful theories (such as SU(5) grand unification) which experiment showed to be dead wrong; and a candid view of the troubling groupthink, appeal to authority, and intellectual arrogance of some members of the string theory community. As with all of Smolin's writing, this is a joy to read, and you get the sense that he's telling you the straight story, as honestly as he can, not trying to sell you something. If you're interested in these issues, you'll probably also want to read Leonard Susskind's pro-string The Cosmic Landscape (March 2006) and Peter Woit's sceptical Not Even Wrong (June 2006).

 Permalink

Wells, H. G. Little Wars. Springfield, VA: Skirmisher, [1913] 2004. ISBN 0-9722511-5-4.
I have been looking for a copy of this book for more than twenty-five years. In this 1913 classic, H. G. Wells essentially single-handedly invented the modern pastime of miniature wargaming, providing a (tin soldier) battle-tested set of rules which makes for exciting, well-balanced, and unpredictable games which can be played by two or more people in an afternoon and part of an evening. Interestingly, he avoids much of the baggage that burdens contemporary games such as icosahedral dice and indirect fire calculations, and strictly minimises the rôle of chance, using nothing fancier than a coin toss, and that only in rare circumstances.

The original edition couldn't have appeared at a less auspicious time: published just a year before the outbreak of the horrific Great War (a term Wells uses, prophetically, to speak of actual military conflict in this book). The work is, of course, long out of copyright and text editions are available on the Internet, including this one at Project Gutenberg, but they are unsatisfying because the text makes frequent reference to the nineteen photographs by Wells's second wife, Amy Catherine Wells, which are not included in the on-line editions but reproduced in this volume. Even if you aren't interested in the details, just seeing grown men in suits scrunching down on the ground playing with toy soldiers is worth the price of admission. The original edition included almost 150 delightful humorous line drawings by J. R. Sinclair; sadly, only about half are reproduced here, but that's better than none at all. This edition includes a new foreword by Gary Gygax, inventor of Dungeons and Dragons. Radical feminists of the dour and scornful persuasion should be sure to take their medication before reading the subtitle or the last paragraph on page 6 (lines 162–166 of the Gutenberg edition).

 Permalink

October 2006

Dworkin, Ronald W. Artificial Happiness. New York: Carroll & Graf, 2006. ISBN 0-7867-1714-9.
Western societies, with the United States in the lead, appear to be embarked on a grand scale social engineering experiment with little consideration of the potentially disastrous consequences both for individuals and the society at large. Over the last two decades “minor depression”, often no more than what, in less clinical nomenclature, one would term unhappiness, has come to be seen as a medical condition treatable with pharmaceuticals, and prescription of these medications, mostly by general practitioners, not psychiatrists or psychologists, has skyrocketed, with drugs such as Prozac, Paxil, and Zoloft regularly appearing on lists of the most frequently prescribed. Tens of millions of people in the United States take these pills, which are being prescribed to children and adolescents as well as adults.

Now, there's no question that these medications have been a Godsend for individuals suffering from severe clinical depression, which is now understood in many cases to be an organic disease caused by imbalances in the metabolism of neurotransmitters in the brain. But this vast public health experiment in medicating unhappiness is another thing altogether. Unhappiness, like pain, is a signal that something's wrong, and a motivator to change things for the better. But if unhappiness is seen as a disease which is treated by swallowing pills, this signal is removed, and people are numbed or stupefied out of taking action to eliminate the cause of their unhappiness: changing jobs or careers, reducing stress, escaping from abusive personal relationships, or embarking on some activity which they find personally rewarding. Self esteem used to be thought of as something you earned from accomplishing difficult things; once it becomes a state of mind you get from a bottle of pills, then what will become of all the accomplishments the happily medicated no longer feel motivated to achieve?

These are serious questions, and deserve serious investigation and a book-length treatment of the contemporary scene and trends. This is not, however, that book. The author is an M.D. anæsthesiologist with a Ph.D. in political philosophy from Johns Hopkins University, and a senior fellow at the Hudson Institute—impressive credentials. Notwithstanding them, the present work reads like something written by somebody who learned Marxism from a comic book. Individuals, entire professions, and groups as heterogeneous as clergy of organised religions are portrayed like cardboard cutouts—with stick figures drawn on them—in crayon. Each group the author identifies is seen as acting monolithically toward a specific goal, which is always nefarious in some way, advancing an agenda based solely on its own interest. The possibility that a family doctor might prescribe antidepressants for an unhappy patient in the belief that he or she is solving a problem for the patient is scarcely considered. No, the doctor is part of a grand conspiracy of “primary care physicians” advancing an agenda to usurp the “turf” (a term he uses incessantly) of first psychiatrists, and finally organised religion.

After reading this entire book, I still can't decide whether the author is really as stupid as he seems, or simply writes so poorly that he comes across that way. Each chapter starts out lurching toward a goal, loses its way and rambles off in various directions until the requisite number of pages has been filled, and then states a conclusion which is not justified by the content of the chapter. There are few clichés in the English language which are not used here—again and again. Here is an example of one of hundreds of paragraphs to which the only rational reaction is “Huh?”.

So long as spirituality was an idea, such as believing in God, it fell under religious control. However, if doctors redefined spirituality to mean a sensual phenomenon—a feeling—then doctors would control it, since feelings had long since passed into the medical profession's hands, the best example being unhappiness. Turning spirituality into a feeling would also help doctors square the phenomenon with their own ideology. If spirituality were redefined to mean a feeling rather than an idea, then doctors could group spirituality with all the other feelings, including unhappiness, thereby preserving their ideology's integrity. Spirituality, like unhappiness, would become a problem of neurotransmitters and a subclause of their ideology. (Page 226.)
A reader opening this book is confronted with 293 pages of this. This paragraph appears in chapter nine, “The Last Battle”, which describes the Manichean struggle between doctors and organised religion in the 1990s for the custody of the souls of Americans, ending in a total rout of religion. Oh, you missed that? Me too.

Mass medication with psychotropic drugs is a topic which cries out for a statistical examination of its public health dimensions, but Dworkin relates only anecdotes of individuals he has known personally, all of whose minds he seems to be able to read, diagnosing their true motivations which even they don't perceive, and discerning their true destiny in life, which he believes they are failing to follow due to medication for unhappiness.

And if things weren't muddled enough, he drags in “alternative medicine” (the modern, polite term for what used to be called “quackery”) and “obsessive exercise” as other sources of Artificial Happiness (which he capitalises everywhere), which is rather odd since he doesn't believe either works except through the placebo effect. Isn't it just a little bit possible that some of those people working out at the gym are doing so because it makes them feel better and likely to live longer? Dworkin tries to envision the future for the Happy American, decoupled from the traditional trajectory through life by the ability to experience chemically induced happiness at any stage. Here, he seems to simultaneously admire and ridicule the culture of the 1950s, of which his knowledge seems to be drawn from re-runs of “Leave it to Beaver”. In the conclusion, he modestly proposes a solution to the problem which requires completely restructuring medical education for general practitioners and redefining the mission of all organised religions. At least he doesn't seem to have a problem with self-esteem!

 Permalink

Peters, Eric. Automotive Atrocities. St. Paul, MN: Motorbooks International, 2004. ISBN 0-7603-1787-9.
Oh my, oh my, there really were some awful automobiles on the road in the 1970s and 1980s, weren't there? Those born too late to experience them may not be fully able to grasp the bumper to bumper shoddiness of such rolling excrescences as the diesel Chevette, the exploding Pinto, Le Car, the Maserati Biturbo, the Cadillac V-8-6-4 and even worse diesel; bogus hamster-powered muscle cars (“now with a black stripe and fake hood scoop, for only $5000 more!”); the Yugo, the DeLorean, and the Bricklin—remember that one?

They're all here, along with many more vehicles which, like so many things of that era, can only elicit in those who didn't live through it, the puzzled response, “What were they thinking?” Hey, I lived through it, and that's what I used to think when blowing past multi-ton wheezing early 80s Thunderbirds (by then, barely disguised Ford Fairmonts) in my 1972 VW bus!

Anybody inclined toward automotive Schadenfreude will find this book enormously entertaining, as long as they weren't among the people who spent their hard-earned, rapidly-inflating greenbacks on one of these regrettable rolling rustbuckets. Unlike many automotive books, this one is well-produced and printed, has few if any typographical errors, and includes many excerpts from the contemporary sales material which recall just how slimy and manipulative were the campaigns used to foist this junk off onto customers who, one suspects, the people selling it referred to in the boardroom as “the rubes”.

It is amazing to recall that almost a generation exists whose entire adult experience has been with products which, with relatively rare exceptions, work as advertised, don't break as soon as you take them home, and rapidly improve from year to year. Those of us who remember the 1970s took a while to twig to the fact that things had really changed once the Asian manufacturers raised the quality bar a couple of orders of magnitude above where the U.S. companies thought they had optimised their return.

In the interest of full disclosure, I will confess that I once drove a 1966 MGB, but I didn't buy it new! To grasp what awaited the seventies denizen after they parked the disco-mobile and boogied into the house, see Interior Desecrations (December 2004).

 Permalink

Vilenkin, Alexander. Many Worlds in One. New York: Hill and Wang, 2006. ISBN 0-8090-9523-8.
From the dawn of the human species until a time within the memory of many people younger than I, the origin of the universe was the subject of myth and a topic, if discussed at all within the academy, among doctors of divinity, not professors of physics. The advent of precision cosmology has changed that: the ultimate questions of origin are not only legitimate areas of research, but something probed by satellites in space, balloons circling the South Pole, and mega-projects of Big Science. The results of these experiments have, in the last few decades, converged upon a consensus from which few professional cosmologists would dissent:
  1. At the largest scale, the geometry of the universe is indistinguishable from Euclidean (flat), and the distribution of matter and energy within it is homogeneous and isotropic.
  2. The universe evolved from an extremely hot, dense phase starting about 13.7 billion years ago from our point of observation, which resulted in the abundances of light elements observed today.
  3. The evidence of this event is imprinted on the cosmic background radiation which can presently be observed in the microwave frequency band. All large-scale structures in the universe grew from gravitational amplification of scale-independent quantum fluctuations in density.
  4. The flatness, homogeneity, and isotropy of the universe is best explained by a period of inflation shortly after the origin of the universe, which expanded a tiny region of space, smaller than a subatomic particle, to a volume much greater than the presently observable universe.
  5. Consequently, the universe we can observe today is bounded by a horizon, about forty billion light years away in every direction (greater than the 13.7 billion light years you might expect since the universe has been expanding since its origin), but the universe is much, much larger than what we can see; every year another light year comes into view in every direction.
Now, this may seem mind-boggling enough, but from these premises, which it must be understood are accepted by most experts who study the origin of the universe, one can deduce some disturbing consequences which seem to be logically unavoidable.

Let me walk you through it here. We assume the universe is infinite and unbounded, which is the best estimate from precision cosmology. Then, within that universe, there will be an infinite number of observable regions, which we'll call O-regions, each defined by the volume from which an observer at the centre can have received light since the origin of the universe. Now, each O-region has a finite volume, and quantum mechanics tells us that within a finite volume there are a finite number of possible quantum states. This number, although huge (on the order of 10^10^123 for a region the size of the one we presently inhabit), is not infinite, so consequently, with an infinite number of O-regions, even if quantum mechanics specifies the initial conditions of every O-region completely at random and they evolve randomly with every quantum event thereafter, there are only a finite number of histories they can experience (around 10^10^150). Which means that, at this moment, in this universe (albeit not within our current observational horizon), invoking nothing as fuzzy, weird, or speculative as the many-worlds interpretation of quantum mechanics, there are an infinite number of you reading these words scribbled by an infinite number of me. In the vast majority of our shared universes things continue much the same, but from time to time they d1v3r93 r4ndtx#e~—….

Reset . . .
Snap back to universe of origin . . .
Reloading initial vacuum parameters . . .
Restoring simulation . . .
Resuming from checkpoint.
What was that? Nothing, I guess. Still, odd, that blip you feel occasionally. Anyway, here is a completely fascinating book by a physicist and cosmologist who is pioneering the ragged edge of what the hard evidence from the cosmos seems to be telling us about the apparently boundless universe we inhabit. What is remarkable about this model is how generic it is. If you accept the best currently available evidence for the geometry and composition of the universe in the large, and agree with the majority of scientists who study such matters how it came to be that way, then an infinite cosmos filled with observable regions of finite size and consequently limited diversity more or less follows inevitably, however weird it may seem to think of an infinity of yourself experiencing every possible history somewhere. Further, in an infinite universe, there are an infinite number of O-regions which contain every possible history consistent with the laws of quantum mechanics and the symmetries of our spacetime including those in which, as the author noted, perhaps using the phrase for the first time in the august pages of the Physical Review, “Elvis is still alive”.
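To make the finite-state counting argument above concrete, here is a toy sketch in Python (my own illustration, not anything from Vilenkin's book) of the pigeonhole principle doing the work: once regions outnumber the possible states a region can be in, exact duplicates are unavoidable. The real numbers are inconceivably larger, but the logic is identical.

    import random

    STATES_PER_CELL = 2   # each cell is a two-state "qubit"
    CELLS = 4             # a region of 4 cells has 2**4 = 16 possible states
    REGIONS = 100         # far more regions than possible states

    possible_states = STATES_PER_CELL ** CELLS
    regions = [tuple(random.randrange(STATES_PER_CELL) for _ in range(CELLS))
               for _ in range(REGIONS)]

    # With more regions than states, at least REGIONS - possible_states of
    # them must exactly duplicate the state of some other region.
    duplicates = REGIONS - len(set(regions))
    print(possible_states, "possible states;", REGIONS, "regions;",
          duplicates, "exact duplicates")

Substitute 10^10^123 possible states for 16 and infinity for 100 regions, and you have the argument.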

So generic is the prediction that there's no need to assume the correctness of speculative ideas in physics. The author provides a lukewarm endorsement of string theory and the “anthropic landscape” model, but takes care to distinguish its “multiverse” of distinct vacua with different moduli from our infinite universe with (as far as we know) a single, possibly evolving, vacuum state. But string theory could be completely wrong and the deductions from observational cosmology would still stand. For that matter, they are independent of the “eternal inflation” model the book describes in detail, since they rely only upon observables within the horizon of our single “pocket universe”.

Although the evolution of the universe from shortly after the end of inflation (the moment we call the “big bang”) seems to be well understood, there are still deep mysteries associated with the moment of origin, and the ultimate fate of the universe remains an enigma. These questions are discussed in detail, and the author makes clear how speculative and tentative any discussion of such matters must be given our present state of knowledge. But we are uniquely fortunate to be living in the first time in all of history when these profound questions upon which humans have mused since antiquity have become topics of observational and experimental science, and a number of experiments now underway and expected in the next few years which bear upon them are described.

Curiously, the author consistently uses the word “google” for the number 10^100. The correct name for this quantity, coined in 1938 by nine-year-old Milton Sirotta, is “googol”. Edward Kasner, young Milton's uncle, then defined “googolplex” as 10^10^100. “Google” is an Internet search engine created by megalomaniac collectivists bent on monetising, without compensation, content created by others. The text is complemented by a number of delightful cartoons reminiscent of those penned by George Gamow, a physicist the author (and this reader) much admires.

 Permalink

Rowsome, Frank, Jr. The Verse by the Side of the Road. New York: Plume, [1965] 1979. ISBN 0-452-26762-5.
In the years before the Interstate Highway System, long trips on the mostly two-lane roads in the United States could bore the kids in the back seat near unto death, and drive their parents to the brink of homicide by the incessant drone of “Are we there yet?” which began less than half an hour out of the driveway. A blessed respite from counting cows, license plate poker, and counting down the dwindling number of bottles of beer on the wall would be the appearance on the horizon of a series of six red and white signs, which all those in the car would strain their eyes to be the first to read.

WITHIN THIS VALE

OF TOIL

AND SIN

YOUR HEAD GROWS BALD

BUT NOT YOUR CHIN—USE

Burma-Shave

In the fall of 1925, the owners of the virtually unknown Burma-Vita company of Minneapolis came up with a new idea to promote the brushless shaving cream they had invented. Since the product would have particular appeal to travellers who didn't want to pack a wet and messy shaving brush and mug in their valise, what better way to pitch it than with signs along the highways frequented by those potential customers? Thus was born, at first only on a few highways in Minnesota, what was to become an American institution for decades and a fondly remembered piece of Americana, the Burma-Shave signs. As the signs proliferated across the landscape, so did sales; so rapid was the growth of the company in the 1930s that a director of sales said (p. 38), “We never knew that there was a depression.” At the peak the company had more than six million regular customers, who were regularly reminded to purchase the product by almost 7000 sets of signs—around 40,000 individual signs, all across the country.

While the first signs were straightforward selling copy, Burma-Shave signs quickly evolved into the characteristic jingles, usually rhyming and full of corny humour and outrageous puns. Rather than hiring an advertising agency, the company ran national contests which paid $100 for the best jingle and regularly received more than 50,000 entries from amateur versifiers.

Almost from the start, the company devoted a substantial number of the messages to highway safety; this was not the result of outside pressure from anti-billboard forces as I remember hearing in the 1950s, but based on a belief that it was the right thing to do—and besides, the sixth sign always mentioned the product! The set of signs above is the jingle that most sticks in my memory: it was a favourite of the Burma-Shave founders as well, having been re-run several times since its first appearance in 1933 and chosen by them to be immortalised in the Smithsonian Institution. Another that comes immediately to my mind is the following, from 1960, on the highway safety theme:

THIRTY DAYS

HATH SEPTEMBER

APRIL

JUNE AND THE

SPEED OFFENDER

Burma-Shave

Times change, and with the advent of roaring four-lane freeways, billboard bans or set-back requirements which made sequences of signs unaffordable, the increasing urbanisation of American society, and of course the dominance of television over all other advertising media, by the early 1960s it was clear to the management of Burma-Vita that the road sign campaign was no longer effective. They had already decided to phase it out before they sold the company to Philip Morris in 1963, after which the signs were quickly taken down, depriving the two-lane rural byways of America of some uniquely American wit and wisdom, but who ever drove them any more, since the Interstate went through?

The first half of this delightful book tells the story of the origin, heyday, and demise of the Burma-Shave signs, and the balance lists all of the six hundred jingles preserved in the records of the Burma-Vita Company, by year of introduction. This isn't something you'll probably want to read straight through, but it's great to pick up from time to time when you want a chuckle.

And then the last sign had been read: all the family exclaimed in unison, “Burma-Shave!”. It had been maybe sixty miles since the last set of signs, and so they'd recall that one and remember other great jingles from earlier trips. Then things would quiet down for a while. “Are we there yet?”

 Permalink

Karsh, Efraim. Islamic Imperialism. New Haven, CT: Yale University Press, 2006. ISBN 0-300-10603-3.
A great deal of conflict and tragedy might have been avoided in recent years had only this 2006 book been published a few years earlier and read by those contemplating ambitious adventures to remake the political landscape of the Near East and Central Asia. The author, a professor of history at King's College, University of London, traces the repeated attempts, beginning with Muhammad and his immediate successors, to establish a unified civilisation under the principles of Islam, in which the Koranic proscription of conflict among Muslims would guarantee permanent peace.

In the century following the Prophet's death in the year 632, Arab armies exploded out of the birthplace of Islam and conquered a vast territory from present-day Iran to Spain, including the entire north African coast. This was the first of a succession of great Islamic empires, which would last until the dismantling of the Ottoman Empire in the aftermath of World War I. But, as this book thoroughly documents, over this entire period the emphasis was on the word “empire” and not “Islamic”. While the leaders identified themselves as Muslims and exhorted their armies to holy war, the actual empires were very much motivated by a quest for temporal wealth and power, and behaved much like the previous despotisms they supplanted. Since the Arabs had neither experience in administering an empire nor a cadre of people trained in those arts, they ended up assimilating the bureaucratic structure and personnel of the Persian empire after conquering it, and much the same happened in the formerly Byzantine provinces they conquered in the west.

While soldiers might have seen themselves as spreading the word of Islam by the sword, in fact the conquests were mostly about the traditional rationale for empire: booty and tribute. (The Prophet's injunction against raiding other Muslims does appear to have been one motivation for outward-directed conquest, especially in the early years.) Not only was there relatively little aggressive proselytising of Islam; on a number of occasions conversion to Islam by members of dhimmi populations was discouraged or prohibited outright because the imperial treasury depended heavily on the special taxes non-Muslims were required to pay. Nor did these empires resemble the tranquil Dar al-Islam envisaged by the Prophet—in fact, only 24 years would elapse after his death before the Caliph Uthman was assassinated by his rivals, and that would be the first of many murders, revolutions, plots, and conflicts between Muslim factions within the empires to come.

Nor were the Crusades, seen through contemporary eyes, the cataclysmic clash of civilisations they are frequently described as today. The kingdoms established by the crusaders rapidly became seen as regional powers like any other, and often found themselves in alliance with Muslims against Muslims. Pan-Arabists in modern times who identify their movement with opposition to the hated crusader often fail to note that there was never any unified Arab campaign against the crusaders; when they were finally ejected, it was by the Turks, and their great hero Saladin was, himself, a Kurd.

The latter half of the book recounts the modern history of the Near East, from Churchill's invention of Iraq, through Nasser, Khomeini, and the emergence of Islamism and terror networks directed against Israel and the West. What is simultaneously striking and depressing about this long and detailed history of strife, subversion, oppression, and conflict is that you can open it up to almost any page and apart from a few details, it sounds like present-day news reports from the region. Thirteen centuries of history with little or no evidence for indigenous development of individual liberty, self-rule, the rule of law, and religious tolerance does not bode well for idealistic neo-Jacobin schemes to “implant democracy” at the point of a bayonet. (Modern Turkey can be seen as a counter-example, but it is worth observing that Mustafa Kemal explicitly equated modernisation with the importation and adoption of Western values, and simultaneously renounced imperial ambitions. In this, he was alone in the region.)

Perhaps the lesson one should draw from this long and tragic narrative is that this unfortunate region of the world, which was a fiercely-contested arena of human conflict thousands of years before Muhammad, has resisted every attempt by every actor, the Prophet included, to pacify it over those long millennia. Rather than commit lives and fortune to yet another foredoomed attempt to “fix the problem”, one might more wisely and modestly seek ways to keep it contained and not aggravate the situation.

 Permalink

Finkbeiner, Ann. The Jasons. New York: Viking, 2006. ISBN 0-670-03489-4.
Shortly after the launch of Sputnik thrust science and technology onto the front lines of the Cold War, a group of Manhattan Project veterans led by John Archibald Wheeler decided that the government needed the very best advice from the very best people to navigate these treacherous times, and that the requisite talent was not to be found within the weapons labs and other government research institutions, but in academia and industry, whence it should be recruited to act as an independent advisory panel. This fit well with the mandate of the recently founded ARPA (now DARPA), which was chartered to pursue “high-risk, high-payoff” projects, and needed sage counsel to minimise the former and maximise the latter.

The result was Jason (the name is a reference to Jason of the Argonauts, and is always used in the singular when referring to the group, although the members are collectively called “Jasons”). It is unlikely such a scientific dream team has ever before been assembled to work together on difficult problems. Since its inception in 1960, a total of thirteen known members of Jason have won Nobel prizes before or after joining the group. Members include Eugene Wigner, Charles Townes (inventor of the laser), Hans Bethe (who figured out the nuclear reaction that powers the stars), polymath and quark discoverer Murray Gell-Mann, Freeman Dyson, Val Fitch, Leon Lederman, and more, and more, and more.

Unlike advisory panels who attend meetings at the Pentagon for a day or two and draft summary reports, Jason members gather for six weeks in the summer and work together intensively, “actually solving differential equations”, to produce original results, sometimes inventions, for their sponsors. The Jasons always remained independent—while the sponsors would present their problems to them, it was the Jasons who chose what to work on.

Over the history of Jason, missile defence and verification of nuclear test bans have been a main theme, but along the way they have invented adaptive optics, which has revolutionised ground-based astronomy, explored technologies for detecting antipersonnel mines, and created, in the Vietnam era, the modern sensor-based “electronic battlefield”.

What motivates top-ranked, well-compensated academic scientists to spend their summers in windowless rooms pondering messy questions with troubling moral implications? This is a theme the author returns to again and again in the extensive interviews with Jasons recounted in this book. The answer seems to be something so outré on the modern university campus as to be difficult to vocalise: patriotism, combined with a desire to see that if such things be done, they should be done as wisely as possible.

 Permalink

November 2006

Steyn, Mark. America Alone. Washington: Regnery Publishing, 2006. ISBN 0-89526-078-6.
Leave it to Mark Steyn to write a funny book about the collapse of Western civilisation. Demographics are destiny, and unlike political and economic trends, are easier to extrapolate because the parents of the next generation have already been born: if there are more of them than their own parents, a population is almost certain to increase, and if there are fewer, the population is destined to fall. Once fertility drops to 1.3 children per woman or fewer, a society enters a demographic “death spiral” from which there is no historical precedent for recovery. Italy, Spain, and Russia are already below this level, and the European Union as a whole is at 1.47, far below the replacement rate of 2.1. And what's the makeup of this shrinking population of Europe? Well, we might begin by asking what is the most popular name for boys born in Belgium…and Amsterdam…and Malmö, Sweden: Mohammed. Where is this going? Well, in the words of Mullah Krekar of Norway (p. 39), “We're the ones who will change you. Every Western woman in the EU is producing an average of 1.4 children. Every Muslim woman in the same countries is producing 3.5 children. By 2050, 30 percent of the population in Europe will be Muslim. Our way of thinking…will prove more powerful than yours.”
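To see why 1.3 children per woman is treated as a point of no return, here is a back-of-the-envelope sketch in Python (my own arithmetic, not Steyn's) of how sub-replacement fertility compounds, assuming a constant rate, a replacement level of 2.1, and no migration:

    REPLACEMENT = 2.1   # children per woman needed to hold a population steady

    def cohort(generations, fertility):
        """Size of each new generation relative to the first."""
        return (fertility / REPLACEMENT) ** generations

    for tfr in (2.1, 1.47, 1.3):
        print("TFR %.2f: %3.0f%% of the original cohort after 3 generations"
              % (tfr, 100 * cohort(3, tfr)))

At a fertility rate of 1.3, each generation is only about 62% the size of its predecessor; after three generations, roughly a century, the child-bearing cohort has shrunk to about a quarter of its original size, consistent with the absence of any historical precedent for recovery.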

The author believes, and states forthrightly, that it is the purest fantasy to imagine that this demographic evolution, seen by many of the élite as the only hope of salvation for the European welfare state, can occur without a profound change in the very nature of the societies in which it occurs. The end-point may not be “Eutopia”, but rather “Eurabia”, and the timidity of European nations who already have an urban Muslim population approaching 30% shows how a society which has lost confidence in its own civilisation and traditions and imbibed the feel-good but ultimately debilitating doctrine of multiculturalism ends up assimilating to the culture of the immigrants, not the other way around. Steyn sees only three possible outcomes for the West (p. 204):

  1. Submit to Islam
  2. Destroy Islam
  3. Reform Islam
If option one is inconceivable and option two unthinkable (and probably impossible, certainly without changing Western civilisation beyond recognition and for the worse), you're left with number three, but, as Steyn notes, “Ultimately, only Muslims can reform Islam”. Unfortunately, the recent emergence of a global fundamentalist Islamic identity with explicitly political goals may be the Islamic Reformation, and if that be the case, the trend is going in the wrong direction. So maybe option one isn't off the table, after all.

The author traces the roots of the European predicament to the social democratic welfare state, which like all collectivist schemes, eventually creates a society of perpetual adolescents who never mature into and assume the responsibilities of adults. When the state becomes responsible for all the things the family once had to provide for, and is supported by historically unprecedented levels of taxation which impoverish young families and make children unaffordable, why not live for the present and let the next generation, wherever it may come from, worry about itself? In a static situation, this is a prescription for the kind of societal decline which can be seen in the histories of both Greece and Rome, but when there is a self-confident, rapidly-proliferating immigrant population with no inclination to assimilate, it amounts to handing the keys over to the new tenants in a matter of decades.

Among Western countries, the United States is the great outlier, with fertility just at the replacement rate and immigrants primarily of Hispanic origin who have, historically, assimilated to U.S. society in a generation or two. (There are reasons for concern about the present rate of immigration to the U.S. and the impact of multiculturalism on assimilation there, but that is not the topic of this book.) Steyn envisages a future, perhaps by 2050, where the U.S. looks out upon the world and sees not an “end of history” with liberal democracy and free markets triumphant around the globe but rather (p. 205), “a totalitarian China, a crumbling Russia, an insane Middle East, a disease-ridden Africa, [and] a civil war-torn Eurabia”—America alone.

Heavy stuff, but Steyn's way with words will keep you chuckling as you contemplate the apocalypse. The book is long on worries and short on plausible solutions, other than a list of palliatives which it is unlikely Western societies, even the U.S., have the will to adopt, although the author predicts (p. 192) “By 2015, almost every viable political party in the West will be natalist…”. But demographics don't turn on a dime, and by then, whatever measures are politically feasible may be too little to make much difference.

 Permalink

Macdonald, Lyn. 1915: The Death of Innocence. London: Penguin Books, [1993] 1997. ISBN 0-14-025900-7.
I'm increasingly coming to believe that World War I was the defining event of the twentieth century: not only a cataclysm which destroyed the confident assumptions of the past, but which set history inexorably on a path which would lead to even greater tragedies and horrors as that century ran its course. This book provides an excellent snapshot of what the British people, both at the front and back home, were thinking during the first full year of the war, as casualties mounted and hope faded for the quick victory almost all expected at the outset.

The book does not purport to be a comprehensive history of the war, nor even of the single year it chronicles. It covers only the British Army: the Royal Navy is mentioned only in conjunction with troop transport and landings, and the Royal Flying Corps scarcely at all. The forces of other countries, allied or enemy, are mentioned only in conjunction with their interaction with the British, and no attempt is made to describe the war from their perspective. Finally, the focus is almost entirely on the men in the trenches and their commanders in the field: there is little focus on the doings of politicians and the top military brass, nor on grand strategy, although there was little of that in evidence in the events of 1915 in any case.

Within its limited scope, however, the book succeeds superbly. About a third of the text is extended quotations from people who fought at the front, many from contemporary letters home. Not only do you get an excellent insight into how horrific conditions were in the field, but also how stoically those men accepted them, hardly ever questioning the rationale for the war or the judgement of those who commanded them. And this in the face of a human cost which is nearly impossible to grasp by the standards of present-day warfare. Between the western front and the disastrous campaign in Gallipoli, the British suffered more than half a million casualties (killed, wounded, and missing) (p. 597). In “quiet periods” when neither side was mounting attacks, simply manning their own trenches, British casualties averaged five thousand a week (p. 579), mostly from shelling and sniper fire.

And all of the British troops who endured these appalling conditions were volunteers—conscription did not begin in Britain until 1916. With the Regular Army having been largely wiped out in the battles of 1914, the trenches were increasingly filled with Territorial troops who had volunteered for service in France; units from around the Empire: India, Canada, Australia, and New Zealand; and, as the year progressed, Kitchener's “New Army” of volunteer recruits, rushed through training and thrown headlong into the killing machine. The mindset that motivated these volunteers and the conclusions drawn from their sacrifice set the stage for the even greater subsequent horrors of the twentieth century.

Why? Because they accepted as given that their lives were, in essence, the property of the state which governed the territory in which they happened to live, and that the rulers of that state, solely on the authority of having been elected by a small majority of the voters in an era when suffrage was far from universal, had every right to order them to kill or be killed by subjects of other states with which they had no personal quarrel. (The latter point was starkly illustrated when, at Christmas 1914, British and German troops declared an impromptu cease-fire, fraternised, and played football matches in no man's land before, the holiday behind them, returning to the trenches to resume killing one another for King and Kaiser.) This was a widely shared notion, but the first year of the Great War demonstrated that the populations of the countries on both sides really believed it, and would charge to almost certain death even after being told by Lord Kitchener himself on the parade ground, “that our attack was in the nature of a sacrifice to help the main offensive which was to be launched ‘elsewhere’” (p. 493). That individuals would accept their rôle as property of the state was a lesson which the all-encompassing states of the twentieth century, both tyrannical and more or less democratic, would take to heart, and would manifest itself not only in conscription and total war, but also in expropriation, confiscatory taxation, and arbitrary regulation of every aspect of subjects' lives. Once you accept that the state is within its rights to order you to charge massed machine guns with a rifle and bayonet, you're unlikely to quibble over lesser matters.

Further, the mobilisation of the economy under government direction for total war was taken as evidence that central planning of an industrial economy was not only feasible but more efficient than the market. Unfortunately, few observed that there is a big difference between consuming capital to build the means of destruction over a limited period of time and creating new wealth and products in a productive economy. And finally, governments learnt that control of mass media could mould the beliefs of their subjects as the rulers wished: the comical Fritz with which British troops fraternised at Christmas 1914 had become the detested Boche whose trenches they shelled continuously on Christmas Day a year later (p. 588).

It is these disastrous “lessons” drawn from the tragedy of World War I which, I suspect, charted the tragic course of the balance of the twentieth century and the early years of the twenty-first. Even a year before the outbreak of World War I, almost nobody imagined such a thing was possible, or that it would have the consequences it did. One wonders what will be the equivalent defining event of the twenty-first century, when it will happen, and in what direction it will set the course of history?

A U.S. edition is also available.

 Permalink

Ronan, Mark. Symmetry and the Monster. Oxford: Oxford University Press, 2006. ISBN 0-19-280722-6.
On the morning of May 30th, 1832, self-taught mathematical genius and revolutionary firebrand Évariste Galois was mortally wounded in a duel in Paris, the reasons for which are forgotten; he died the following day, aged twenty. The night before the duel, he wrote a letter urging that his uncompleted mathematical work be sent to the preeminent contemporary mathematicians Jacobi and Gauss; neither, however, ever saw it. The work in question laid the foundations for group theory, an active area of mathematical research a century and three quarters later, and a cornerstone of the most fundamental theories of physics: Noether's Theorem demonstrates that conservation laws and physical symmetries are two aspects of the same thing.

Finite groups, which govern symmetries among a finite number of discrete items (as opposed to, say, the rotations of a sphere, which are continuously valued), can be arbitrarily complicated, but, as shown by Galois, can be decomposed into one or more simple groups whose only normal subgroups are the trivial subgroup of order one and the improper subgroup consisting of the entire group itself: these are the fundamental kinds of symmetries or, as this book refers to them, the “atoms of symmetry”, and there are only five categories (four of the five are infinite families). The fifth category comprises the sporadic groups, which fit into none of the others. The first was discovered by Émile Mathieu in 1861, and between then and 1873 he found four more. As group theory continued to develop, mathematicians kept finding more and more of these sporadic groups, and nobody knew whether there were only a finite number or infinitely many of them…until recently.

Most research papers in mathematics are short and concise. Some group theory papers are the exception, with two-hundred-page papers packed with dense notation not uncommon. The classification theorem of finite simple groups is the ultimate outlier; it has been likened to the Manhattan Project of pure mathematics. Consisting of hundreds of papers published over decades by a large collection of authors, it is estimated, if every component involved in the proof were collected together, to be on the order of fifteen thousand pages, many of which are so technical that those not involved in the work itself have extreme difficulty understanding them. (In fact, a “Revision project” is currently underway with the goal of restating the proof in a form which future generations of mathematicians will be able to comprehend.) The last part of the classification theorem, itself more than a thousand pages in length, was not put into place until November 2004, so only then could one say with complete confidence that there are only 26 sporadic groups, all of which are known.

While these groups are “simple” in the sense of not being decomposable, the symmetries most of them represent are of mind-boggling complexity. The order of a finite group is the number of elements it contains; for example, the group of permutations on five items has an order of 5! = 120. The smallest sporadic group has an order of 7920, and the biggest, well, it's a monster. In fact, that's what it's called, the “monster group”, and its order is (deep breath):

808,017,424,794,512,875,886,459,904,961,710,757,005,754,368,000,000,000 =
2^46 × 3^20 × 5^9 × 7^6 × 11^2 × 13^3 × 17 × 19 × 23 × 29 × 31 × 41 × 47 × 59 × 71
If it helps, you can think of the monster as the group of rotations in a space of 196,884 dimensions—much easier to visualise, isn't it? In any case, that's how Robert Griess first constructed the monster in 1982, in a 102 page paper done without a computer.
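Sceptical readers can check the arithmetic themselves: the factorisation multiplies out to exactly the 54-digit number quoted above, as a few lines of Python (or any language with arbitrary-precision integers) will confirm:

    MONSTER = 808017424794512875886459904961710757005754368000000000

    factors = {2: 46, 3: 20, 5: 9, 7: 6, 11: 2, 13: 3, 17: 1, 19: 1,
               23: 1, 29: 1, 31: 1, 41: 1, 47: 1, 59: 1, 71: 1}

    product = 1
    for prime, power in factors.items():
        product *= prime ** power

    print(product == MONSTER)   # True
    print(sorted(factors))      # the 15 primes dividing the monster's order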

In one of those “take your breath away” connections between distant and apparently unrelated fields of mathematics, the prime divisors of the order of the monster are precisely the 15 supersingular primes, which are intimately related to the j-function of number theory. Other striking coincidences, or maybe deep connections, link the monster group to the Lorentzian geometry of general relativity, the multidimensional space of string theory, and the enigmatic properties of the number 163 in number theory. In 1983, Freeman Dyson mused, “I have a sneaking hope, a hope unsupported by any facts or any evidence, that sometime in the twenty-first century physicists will stumble upon the Monster group, built in some unsuspected way into the structure of the universe.” Hey, stranger things have happened.

This book, by a professional mathematician who is also a talented populariser of the subject, tells the story of this quest. During his career, he personally knew almost all of the people involved in the classification project, and leavens the technical details with biographical accounts and anecdotes of the protagonists. To avoid potentially confusing mathematical jargon, he uses his own nomenclature: “atom of symmetry” instead of finite simple group, “deconstruction” instead of decomposition, and so on. This sometimes creates its own confusion, since the extended quotes from mathematicians use the standard terminology; the reader should refer to the glossary at the end of the book to resolve any such puzzlement.

 Permalink

Meers, Nick. Stretch: The World of Panoramic Photography. Mies, Switzerland: RotoVision, 2003. ISBN 2-88046-692-X.
In the early years of the twentieth century, panoramic photography was all the rage. Itinerant photographers with unwieldy gear such as the Cirkut camera would visit towns to photograph and sell 360° panoramas of the landscape and wide format pictures of large groups of people, such as students at the school or workers at a factory or mine. George Lawrence's panoramas (some taken from a camera carried aloft by a kite) of the devastation resulting from the 1906 San Francisco earthquake and fire have become archetypal images of that disaster.

Although pursued as an art form by a small band of photographers, and still used occasionally for large group portraits, the panoramic fad largely died out with the popularity of fixed-format roll film cameras and the emergence of the ubiquitous 24×36 mm format. The advent of digital cameras and desktop image processing software able to “stitch” multiple images more or less seamlessly (if you know what you're doing when you take them) into an arbitrarily wide panorama has sparked a renaissance in the format, including special-purpose film and digital cameras for panoramic photography. Computers with high performance graphics hardware now permit viewing full-sphere virtual reality imagery in which the viewer can “look around” at will, something undreamed of in the first golden age of panoramas.

This book provides an introduction to the history, technology, and art of panoramic photography, alternating descriptions of equipment and technique with galleries featuring the work of contemporary masters of the format, including many examples of non-traditional subjects for panoramic presentation which will give you ideas for your own experiments. The book, which is beautifully printed in China, is itself in “panoramic format” with pages 30 cm wide by 8 cm tall for an aspect ratio of 3¾:1, allowing many panoramic pictures to be printed on a single page. (There are a surprising number of vertical panoramas in the examples which are short-changed by the page format, as they are always printed vertically rather than asking you to turn the book around to view them.) Although the quality of reproduction is superb, the typography is frankly irritating, at least to my ageing eyes. The body copy is set in a light sans-serif font with capitals about six points tall, and photo captions in even smaller type: four-point capitals. If that wasn't bad enough, all of the sections on technique are printed in white type on a black background which, especially given the high reflectivity of the glossy paper, is even more difficult to read. This appears to be entirely for artistic effect—there is plenty of white (or black) space which would have permitted using a more readable font. The cover price of US$30 seems high for a work of fewer than 150 pages, however wide and handsome.

 Permalink

Roth, Philip. The Plot Against America. New York: Vintage, 2004. ISBN 1-4000-7949-7.
Pulitzer Prize-winning mainstream novelist Philip Roth turns to alternative history in this novel, which also falls into the genre Rudy Rucker pioneered and named “transreal”—autobiographical fiction, in which the author (or a character clearly based upon him) plays a major part in the story. Here, the story is told in the first person by the author, as a reminiscence of his boyhood in the early 1940s in Newark, New Jersey. In this timeline, however, after a deadlocked convention, the Republican party chooses Charles Lindbergh as its 1940 presidential candidate who, running on an isolationist platform of “Vote for Lindbergh or vote for war”, defeats FDR's bid for a third term in a landslide.

Once Lindbergh takes office, his tilt toward the Axis becomes increasingly evident. He appoints the virulently anti-Semitic Henry Ford as Secretary of the Interior, flies to Iceland to sign a pact with Hitler, and concludes a treaty with Japan which accepts all its Asian conquests so far. Further, he cuts off all assistance to Britain and the USSR. On the domestic front, his Office of American Absorption begins encouraging “urban” children (almost all of whom happen to be Jewish) to spend their summers on farms in the “heartland” imbibing “American values”, and later escalates to “encouraging” the migration of entire families (who happen to be Jewish) to rural areas.

All of this, and its many consequences, ranging from trivial to tragic, are seen through the eyes of young Philip Roth, perceived as a young boy would who was living through all of this and trying to make sense of it. A number of anecdotes have nothing to do with the alternative history story line and may be purely autobiographical. This is a “mood novel” and not remotely a thriller; the pace of the story-telling is languid, evoking the time sense and feeling of living in the present of a young boy. As alternative history, I found a number of aspects implausible and unpersuasive. Most exemplars of the genre choose one specific event at which the story departs from recorded history, then spin out the ramifications of that event as the story develops. For example, in 1945 by Newt Gingrich and William Forstchen, after the attack on Pearl Harbor, Germany does not declare war on the United States, which only goes to war against Japan. In Roth's book, the point of divergence is simply the nomination of Lindbergh for president. Now, in the real election of 1940, FDR defeated Wendell Willkie by 449 electoral votes to 82, with the Republican carrying only 10 of the 48 states. But here, with Lindbergh as the nominee, we're supposed to believe that FDR would lose in forty-six states, carrying only his home state of New York and squeaking to a narrow win in Maryland. This seems highly implausible to me—Lindbergh's agitation on behalf of America First made him a highly polarising figure, and his apparent sympathy for Nazi Germany (in 1938 he accepted a gold medal decorated with four swastikas from Hermann Göring in Berlin) made him anathema in much of the media. All of these negatives would have been pounded home by the Democrats, who had firm control of the House and Senate as well as the White House, and all the advantages of incumbency. Turning a 38 state landslide into a 46 state wipeout simply by changing the Republican nominee stretches suspension of disbelief to the limit, at least for this reader, especially as Americans are historically disinclined to elect “outsiders” to the presidency.

If you accept this premise, then most of what follows is reasonably plausible and the descent of the country into a folksy all-American kind of fascism is artfully told. But then something very odd happens. As events are unfolding at their rather leisurely pace, on page 317 it's like the author realised he was about to run out of typewriter ribbon or something, and the whole thing gets wrapped up in ten pages, most of which is an unconfirmed account by one of the characters of behind-the-scenes events which may or may not explain everything, and then there's a final chapter to sort out the personal details. This left me feeling like Charlie Brown when Lucy snatches away the football; either the novel should be longer, or else the pace of the whole thing should be faster rather than this whiplash-inducing discontinuity right before the end—but who am I to give advice to somebody with a Pulitzer?

A postscript provides real-world biographies of the many historical figures who appear in the novel, and the complete text of Lindbergh's September 1941 Des Moines speech to the America First Committee which documents his contemporary sentiments for readers who are unaware of this unsavoury part of his career.

 Permalink

December 2006

Bova, Ben. Mercury. New York: Tor, 2005. ISBN 0-7653-4314-2.
I hadn't read anything by Ben Bova in years—certainly not since 1990. I always used to think of him as a journeyman science fiction writer, cranking out enjoyable stories mostly toward the hard end of the science fiction spectrum, but not a grand master of the calibre of, say, Heinlein, Clarke, and Niven. His stint as editor of Analog was admirable, proving himself a worthy successor to John W. Campbell, who developed the authors of the golden age of science fiction. Bova is also a prolific essayist on science, writing, and other topics, and his January 1965 Analog article “It's Done with Mirrors” with William F. Dawson may have been one of the earliest proposals of a multiply-connected small universe cosmological model.

I don't read a lot of fiction these days, and tend to lose track of authors, so when I came across this book in an airport departure lounge and noticed it was published in 2005, my first reaction was, “Gosh, is he still writing?” (Bova was born in 1932, and his first novel was published in 1959.) The U.K. paperback edition was featured in a “buy one, get one free” bin, so how could I resist?

I ought to strengthen my resistance. This novel is so execrably bad that several times in the process of reading it I was tempted to rip it to bits and burn them to ensure nobody else would have to suffer the experience. There is nothing whatsoever redeeming about this book. The plot is a conventional love triangle/revenge tale. The only thing that makes it science fiction at all is that it's set in the future and involves bases on Mercury, space elevators, and asteroid mining, but these are just backdrops for a story which could take place anywhere. Notwithstanding the title, which places it within the author's “Grand Tour” series, only about half of the story takes place on Mercury, whose particulars play only a small part.

Did I mention the writing? No, I guess I was trying to forget it. Each character, even throw-away figures who appear only in a single chapter, is introduced by a little sketch which reads like something produced by filling out a form. For example,

Jacqueline Wexler was such an administrator. Gracious and charming in public, accommodating and willing to compromise at meetings, she nevertheless had the steel-hard will and sharp intellect to drive the ICU's ramshackle collection of egos toward goals that she herself selected. Widely known as ‘Attila the Honey,’ Wexler was all sweetness and smiles on the outside, and ruthless determination within.
After spending a third of page 70 on this paragraph, which makes my teeth ache just to re-read, the formidable Ms. Wexler walks off stage before the end of p. 71, never to re-appear. But fear not (or fear), there are many, many more such paragraphs in subsequent pages.

An Earth-based space elevator, a science fiction staple, is central to the plot, and here Bova bungles the elementary science of such a structure in a laugh-out-loud chapter in which the three principal characters ride the elevator to a platform located at the low Earth orbit altitude of 500 kilometres. Upon arrival there, they find themselves weightless, while in reality the apparent gravity on a structure co-rotating with the Earth at that altitude would be only about 14% less than on the surface! Objects in orbit are weightless because the centrifugal effect of their orbital velocity exactly balances Earth's gravity, but a platform on an elevator at 500 kilometres is travelling only at the speed of the Earth's rotation, about 1/15 of orbital velocity at that altitude. The only place on a space elevator where weightlessness would be experienced is the point where co-rotation with the Earth coincides with orbital velocity, and that is at geosynchronous altitude. This is not a small detail; it is central to the physics, engineering, and economics of space elevators, and it figured prominently in Arthur C. Clarke's 1979 novel The Fountains of Paradise, which is alluded to here on p. 140.
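The numbers are easy to check. Here is a quick sketch in Python using standard values for Earth's gravitational parameter, radius, and rotation rate (the figures are mine, not Bova's):

    import math

    GM = 3.986e14          # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6.371e6      # mean radius of the Earth, m
    OMEGA = 7.2921e-5      # sidereal rotation rate, rad/s

    r = R_EARTH + 500e3                 # 500 km up the elevator
    v_platform = OMEGA * r              # co-rotating with the Earth
    v_orbit = math.sqrt(GM / r)         # speed of a free circular orbit

    # Apparent gravity on the tower: gravitation minus the centrifugal term
    g_apparent = GM / r**2 - OMEGA**2 * r
    g_surface = GM / R_EARTH**2

    r_geo = (GM / OMEGA**2) ** (1 / 3)  # radius where the two speeds agree

    print("platform %.0f m/s, orbit %.0f m/s" % (v_platform, v_orbit))
    print("apparent gravity %.2f m/s^2 vs %.2f at the surface"
          % (g_apparent, g_surface))
    print("weightless only at r = %.0f km (geosynchronous)" % (r_geo / 1e3))

The platform crawls along at about 500 m/s against the 7.6 km/s a free orbit requires, and a passenger would still feel about 86% of their surface weight: nothing remotely like free fall until the cable reaches the geosynchronous radius, about 42,000 km from the Earth's centre.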

Nor does Bova restrain himself from what is becoming a science fiction cliché of the first magnitude: “nano-magic”. This is my term for using the “nano” prefix the way bad fantasy authors use “magic”. For example, Lord Hacksalot draws his sword and cuts down a mighty oak tree with a single blow, smashing the wall of the evil prince's castle. The editor says, “Look, you can't cut down an oak tree with a single swing of a sword.” Author: “But it's a magic sword.” On p. 258 the principal character is traversing a tether between two parts of a ship in the asteroid belt which, for some reason, the author believes is filled with deadly radiation. “With nothing protecting him except the flimsy…suit, Bracknell felt like a turkey wrapped in a plastic bag inside a microwave oven. He knew that high-energy radiation was sleeting down on him from the pale, distant Sun and still-more-distant stars. He hoped that suit's radiation protection was as good as the manufacturer claimed.” Imaginary editor (who clearly never read this manuscript): “But the only thing which can shield you from heavy primary cosmic rays is mass, and lots of it. No ‘flimsy suit’ however it's made, can protect you against iron nuclei incoming near the speed of light.” Author: “But it's a nano suit!”

Not only is the science wrong, the fiction is equally lame. Characters simply don't behave as people do in the real world, nor are events and their consequences plausible. We are expected to believe that the causes of and blame for a technological catastrophe which killed millions would be left to be decided by a criminal trial of a single individual in Ecuador without any independent investigation. Or that a conspiracy to cause said disaster involving a Japanese mega-corporation, two mass religious movements, rogue nanotechnologists, and numerous others could be organised, executed, and subsequently kept secret for a decade. The dénouement hinges on a coincidence so fantastically improbable that the plausibility of the plot would be improved were the direct intervention of God Almighty posited instead.

Whatever became of Ben Bova, whose science was scientific and whose fiction was fun to read? It would be uncharitable to attribute this waste of ink and paper to age, as many science fictioneers with far more years on the clock have penned genuine classics. But look at this! Researching the author's biography, I discovered that in 1996, at the age of 64, he received a doctorate in education from California Coast University, a “distance learning” institution. Now, remember back when you were in engineering school struggling with thermogoddamics and fluid mechanics how you regarded the student body of the Ed school? Well, I always assumed it was a selection effect—those who can do, and those who can't…anyway, it never occurred to me that somewhere in that dark, lowering building they had a nano brain mushifier which turned the earnest students who wished to dedicate their careers to educating the next generation into the cognitively challenged classes they graduated. I used to look forward to reading anything by Ben Bova; I shall, however, forgo further works by the present Doctor of Education.

 Permalink

Gershenfeld, Neil. Fab. New York: Basic Books, 2005. ISBN 0-465-02745-8.
Once, every decade or so, you encounter a book which empowers you in ways you never imagined before you opened it, and ultimately changes your life. This is one of those books. I am who I am (not to sound too much like Popeye) largely because in the fall of 1967 I happened to read Daniel McCracken's FORTRAN book and realised that there was nothing complicated at all about programming computers—it was a vocational skill that anybody could learn, much like operating a machine tool. (Of course, as you get deeper into the craft, you discover there is a great body of theory to master, but there's much you can accomplish if you're willing to work hard and learn on the job before you tackle the more abstract aspects of the art.) But this was not only something that I could do but, more importantly, I could learn by doing—and that's how I decided to spend the rest of my professional life and I've never regretted having done so. I've never met a genuinely creative person who wished to spend a nanosecond in a classroom downloading received wisdom at dial-up modem bandwidth. In fact, I suspect the absence of such people in the general population is due to the pernicious effects of the Bismarck worker-bee indoctrination to which the youth of most “developed” societies are subjected today.

We all know that, some day, society will pass through the nanotechnological singularity, after which we'll be eternally free, eternally young, immortal, and incalculably rich: hey—works for me! But few people realise that if the age of globalised mass production is analogous to that of mainframe computers, and if the desktop nano-fabricator is equivalent to today's personal supercomputer, then we're already in the equivalent of the minicomputer age of personal fabrication. Remember minicomputers? Not too large, not too small, and hence difficult to classify: too expensive for most people to buy, but within the budget of groups far smaller than the governments and large businesses who could afford mainframes.

The minicomputer age of personal fabrication is as messy as the architecture of the minicomputers of four decades before: there are lots of different approaches, standards, and interfaces, all mutually incompatible. Isn't innovation wonderful? Well, in this sense, no! But it's here, now. For a sum in the tens of thousands of U.S. dollars, it is now possible to equip a “Fab Lab” which can make “almost anything”. Such a lab can fit into a modestly sized room and, provided with electrical power and an Internet connection, can empower whoever crosses its threshold to create whatever their imagination can conceive. In just a few minutes, their dream can become tangible hardware in the real world.

The personal computer revolution empowered almost anybody (at least in the developed world) to create whatever information processing technology their minds could imagine, on their own, or in collaboration with others. The Internet expanded the scope of this collaboration and connectivity around the globe: people who have never met one another are now working together to create software which will be used by people who have never met the authors to everybody's mutual benefit. Well, software is cool, but imagine if this extended to stuff. That's what Fab is about. SourceForge currently hosts more than 135,500 software development projects—imagine what will happen when StuffForge.net (the name is still available, as I type this sentence!) hosts millions of OpenStuff things you can download to your local Fab Lab, make, and incorporate into inventions of your own imagination. This is the grand roll-back of the industrial revolution, the negation of globalisation: individuals, all around the world, creating for themselves products tailored to their own personal needs and those of their communities, drawing upon the freely shared wisdom and experience of their peers around the globe. What a beautiful world it will be!

Cynics will say, “Sure, it can work at MIT—you have one of the most talented student bodies on the planet, supported by a faculty which excels in almost every discipline, and an industrial plant with bleeding edge fabrication technologies of all kinds.” Well, yes, it works there. But the most inspirational thing about this book is that it seems to work everywhere: not just at MIT but also in South Boston, rural India, Norway far north of the Arctic Circle, Ghana, and Costa Rica—build it and they will make. At times the author seems unduly amazed that folks without formal education and the advantages of a student at MIT can imagine, design, fabricate, and apply a solution to a problem in their own lives. But we're human beings—tool-making primates who've prospered by figuring things out and finding ways to make our lives easier by building tools. Is it so surprising that putting the most modern tools into the hands of people who daily confront the most fundamental problems of existence (access to clean water, food, energy, and information) will yield innovations which surprise even professors at MIT?

This book is so great, and so inspiring, that I will give the author a pass on his clueless attack on AutoCAD's (never attributed) DXF file format on pp. 46–47. I'll note simply that it's called “DXF” because Lotus had already used “DIF” for their spreadsheet interchange files and we didn't want to create confusion with their format, and that there's more than one code for an X co-ordinate because many geometrical objects require more than one X co-ordinate to define them (well, duh).
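To make that last point concrete, here is a little sketch of my own (an illustration, not anything from the book) which writes a minimal DXF file containing a single LINE entity. A line has two endpoints, so its X co-ordinates necessarily carry two different group codes, 10 and 11 (the Y values take 20 and 21):

    # Emit a minimal DXF file containing one LINE entity.  DXF is a
    # sequence of (group code, value) pairs, one per line; codes 10/20
    # give the X/Y of the line's first point, 11/21 the second.
    def write_minimal_dxf(path, x1, y1, x2, y2):
        pairs = [
            (0, "SECTION"), (2, "ENTITIES"),  # open the entities section
            (0, "LINE"), (8, "0"),            # a LINE entity on layer 0
            (10, x1), (20, y1),               # first endpoint: X is code 10
            (11, x2), (21, y2),               # second endpoint: X is code 11
            (0, "ENDSEC"), (0, "EOF"),        # close the section and file
        ]
        with open(path, "w") as f:
            for code, value in pairs:
                f.write(f"{code}\n{value}\n")

    write_minimal_dxf("line.dxf", 0.0, 0.0, 10.0, 5.0)

Seen this way, the multiple codes are not mysterious at all: they are simply how a flat tagged-value format distinguishes the several X values an entity may need.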

The author also totally gets what I've been talking about since Unicard, and even before that as “Gizmos”: that every single device in the world, and every button on every device, will eventually have its own (IPv6) Internet address and be able to interact with every other such object in every way that makes sense. I envisioned MIDI networks as the cheapest way to implement this bottom-feeder light-switch to light-bulb network; the author, a decade later, opts for a PCM “Internet 0”—works for me. The medium doesn't matter; what ultimately matters is that the message makes it end to end so cheaply that you can ignore the cost of the interconnection.
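For a sense of how little is needed at the application layer, here is a toy sketch of my own (not Gershenfeld's Internet 0 encoding; the address and port are invented for illustration): a “switch” that toggles a “bulb” with a single UDP datagram over IPv6:

    import socket

    BULB_ADDR = ("::1", 5555)   # hypothetical bulb address (IPv6 loopback here)

    def switch_toggle():
        """The switch: send a one-byte 'toggle' message to the bulb."""
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as s:
            s.sendto(b"T", BULB_ADDR)

    def bulb_listen():
        """The bulb: wait for toggle messages and flip its state."""
        state = False
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as s:
            s.bind(BULB_ADDR)
            while True:
                message, _sender = s.recvfrom(16)
                if message == b"T":
                    state = not state
                    print("bulb is now", "on" if state else "off")

    # Run bulb_listen() in one process, then call switch_toggle() in another.

Everything below that level (the wire, the radio, the modulation) is negotiable, which is precisely the point.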

The author closes the book with the invitation:

Finally, demand for fab labs as a research project, as a collection of capabilities, as a network of facilities, and even as a technological empowerment movement is growing beyond what can be handled by the initial collection of people and institutional partners that were involved in launching them. I/we welcome your thoughts on, and participation in, shaping their future operational, organizational, and technological form.
Well, I am but a humble programmer, but here's how I'd go about it. First of all, I'd create a “Fabrication Trailer” which could visit every community in the United States, Canada, and Mexico; I'd send it out on the road in every MIT vacation season to preach the evangel of “make” to every community it visited. In, say, one of every eighty such communities, one would find a person who had dreamed of this happening in his or her lifetime and who was empowered by seeing it happen; provide them a template which, by the writing of a cheque, can replicate the Fab Lab, and watch it spread. And as it spreads, and creates wealth, it will spawn other Fab Labs.

Then, after it's perfected in a couple of hundred North American copies, design a Fab Lab that fits into an ocean cargo container and can be shipped anywhere. If there isn't electricity and Internet connectivity, also deliver the diesel generator or solar panels and satellite dish. Drop these into places where they're most needed, along with a wonk who can bootstrap the locals into doing things with these tools which astound even those who created them. Humans are clever, tool-making primates; give us the tools to realise what we imagine and then stand back and watch what happens!

The legacy media bombard us with conflict, murder, and mayhem. But the future is about creation and construction. What will happen when an Army of Davids turns its creativity and ingenuity toward creating solutions to problems perceived and addressed by individuals? Why, they'll call it a renaissance! And that's exactly what it will be.

For more information, visit the Web site of The Center for Bits and Atoms at MIT, which the author directs. Fab Central provides links to Fab Labs around the world, the machines they use, and the open source software tools you can download and start using today.

 Permalink

Milosz, Czeslaw. The Captive Mind. New York: Vintage, [1951, 1953, 1981] 1990. ISBN 0-679-72856-2.
This book is an illuminating exploration of life in a totalitarian society, written by a poet and acute observer of humanity who lived under two of the tyrannies of the twentieth century and briefly served one of them. The author was born in Lithuania in 1911 and studied at the university in Vilnius, a city he describes (p. 135) as “ruled in turn by the Russians, Germans, Lithuanians, Poles, again the Lithuanians, again the Germans, and again the Russians”—and now again the Lithuanians. An ethnic Pole, he settled in Warsaw after graduation, and witnessed the partition of Poland between Nazi Germany and the Soviet Union at the outbreak of World War II, conquest and occupation by Germany, “liberation” by the Red Army, and the imposition of Stalinist rule under the tutelage of Moscow. After working with the underground press during the war, the author initially supported the “people's government”, even serving as a cultural attaché at the Polish embassies in Washington and Paris. As Stalinist terror descended upon Poland and the rigid dialectical “Method” was imposed upon intellectual life, he saw tyranny ascendant once again and chose exile in the West, initially in Paris and finally the U.S., where he became a professor at the University of California at Berkeley in 1961—imagine, an anti-communist at Berkeley!

In this book, he explores the various ways in which the human soul comes to terms with a regime which denies its very existence. Four long chapters explore the careers of four Polish writers he denotes as “Alpha” through “Delta” and the choices they made when faced with a system which offered them substantial material rewards in return for conformity with a rigid system which put them at the service of the State, working toward ends prescribed by the “Center” (Moscow). He likens acceptance of this bargain to swallowing a mythical happiness pill, which, by eliminating the irritations of creativity, scepticism, and morality, guarantees those who take it a tranquil place in a well-ordered society. In a powerful chapter titled “Ketman”—a Persian word denoting fervent protestations of faith by nonbelievers, not only in the interest of self-preservation, but of feeling superior to those they so easily deceive—Milosz describes how an entire population can become actors who feign belief in an ideology and pretend to believe the earnest affirmations of orthodoxy on the part of others while sharing scorn for the few true believers.

The author received the 1980 Nobel Prize in Literature.

 Permalink

Hawkins, Jeff with Sandra Blakeslee. On Intelligence. New York: Times Books, 2004. ISBN 0-8050-7456-2.
Ever since the early days of research into the sub-topic of computer science which styles itself “artificial intelligence”, such work has been criticised by philosophers, biologists, and neuroscientists who argue that while symbolic manipulation, database retrieval, and logical computation may be able to mimic, to some limited extent, the behaviour of an intelligent being, in no case does the computer understand the problem it is solving in the sense a human does. John R. Searle's “Chinese Room” thought experiment is one of the best known and extensively debated of these criticisms, but there are many others just as cogent and difficult to refute.

These days, criticising artificial intelligence verges on hunting cows with a bazooka. Unlike the early days in the 1950s, when everybody expected the world chess championship to be held by a computer within five or ten years, and mathematicians were fretting over what they'd do with their lives once computers learnt to discover and prove theorems thousands of times faster than they could, decades of hype, fads, disappointment, and broken promises have instilled some sense of reality into the expectations most technical people have for “AI”, if not into those working in the field and those they bamboozle with the sixth (or is it the sixteenth?) generation of AI bafflegab.

AI researchers sometimes defend their field by saying “If it works, it isn't AI”, by which they mean that as soon as a difficult problem once considered within the domain of artificial intelligence—optical character recognition, playing chess at the grandmaster level, recognising faces in a crowd—is solved, it's no longer considered AI but simply another computer application, leaving AI with the remaining unsolved problems. There is certainly some truth in this, but a closer look gives the lie to the claim that the solutions to these problems, achieved with enormous effort on the part of numerous researchers and, in most cases, with the application of computing power undreamed of in the early days of AI, actually represent “intelligence”, or at least what one regards as intelligent behaviour on the part of a living brain.

First of all, in no case did a computer “learn” how to solve these problems in the way a human or other organism does; in every case experts analysed the specific problem domain in great detail, developed special-purpose solutions tailored to the problem, and then implemented them on computing hardware which in no way resembles the human brain. Further, each of these “successes” of AI is useless outside its narrow scope of application: a chess-playing computer cannot read handwriting, a speech recognition program cannot identify faces, and a natural language query program cannot solve mathematical “word problems” which pose no difficulty to fourth graders. And while many of these programs are said to be “trained” by presenting them with collections of stimuli and desired responses, no amount of such training will permit, say, an optical character recognition program to learn to write limericks. Such programs can certainly be useful, but nothing other than the fact that they solve problems which were once considered difficult in an age when computers were much slower and had limited memory resources justifies calling them “intelligent”, and outside the marketing department, few people would remotely consider them so.
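To be concrete about what such “training” amounts to, here is a deliberately trivial sketch (my own, not an example from the book): a perceptron which learns a stimulus-to-response mapping for the logical AND of two inputs, and for nothing else:

    # Train a single perceptron on stimulus/response pairs.  It learns
    # the logical AND of two inputs; no amount of such training will
    # teach it to write limericks.
    def train_perceptron(samples, epochs=20, rate=0.1):
        w1 = w2 = bias = 0.0
        for _ in range(epochs):
            for (x1, x2), desired in samples:
                output = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
                error = desired - output
                w1 += rate * error * x1      # nudge the weights toward
                w2 += rate * error * x2      # the desired response
                bias += rate * error
        return w1, w2, bias

    # Stimuli and the desired responses for AND:
    and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w1, w2, bias = train_perceptron(and_samples)

The trained weights reproduce the four desired responses perfectly, and that is the whole of what the program “knows”.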

The subject of this ambitious book is not “artificial intelligence” but intelligence: the real thing, as manifested in the higher cognitive processes of the mammalian brain and embodied, by all the evidence, in the neocortex. One of the most fascinating things about the neocortex is how much a creature can do without one, for only mammals have one. Reptiles, birds, amphibians, fish, and even insects (which barely have a brain at all) exhibit complex behaviour, perception of and interaction with their environment, and adaptation to an extent which puts to shame the much-vaunted products of “artificial intelligence”, and yet they all do so without a neocortex at all. In this book, the author hypothesises that the neocortex evolved in mammals as an add-on to the old brain (essentially, what computer architects would call a “bag hanging on the side of the old machine”). It implements a multi-level hierarchical associative memory for patterns, together with a complementary decoder from patterns back to detailed low-level behaviour. Wired through the old brain to the sensory inputs and motor controls, this hierarchy dynamically learns spatial and temporal patterns and uses them to make predictions, which are fed back to the lower levels of the hierarchy; these, in turn, signal whether further inputs confirm or deny the predictions. The ability of the high-level cortex to correctly predict inputs is what we call “understanding”, and it is something which no computer program is presently capable of doing in the general case.

Much of the recent and present-day work in neuroscience has been devoted to imaging where the brain processes various kinds of information. While fascinating and useful, these investigations may overlook one of the most striking things about the neocortex: that almost every part of it, whether devoted to vision, hearing, touch, speech, or motion, appears to have more or less the same structure. This observation, made by Vernon B. Mountcastle in 1978, suggests there may be a common cortical algorithm by which all of these seemingly disparate forms of processing are done. Consider: by the time sensory inputs reach the brain, they are all in the form of spikes transmitted by neurons, and all outputs are sent in the same form, regardless of their ultimate effect. Further, evidence of plasticity in the cortex is abundant: in cases of damage, the brain seems able to re-wire itself to transfer a function to a different region of the cortex. In a long (70 page) chapter, the author presents a sketchy model of what such a common cortical algorithm might be, and of how it may be implemented within the known physiological structure of the cortex.
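To give a flavour of the memory-prediction idea, here is a deliberately trivial sketch of my own (not Hawkins's model, and nothing like a real cortical circuit): a memory which learns temporal sequences of patterns and checks each new input against what it predicted, with a failed prediction standing in for the “surprise” a cortical region would pass up the hierarchy:

    from collections import defaultdict, Counter

    class SequenceMemory:
        """Learn which pattern tends to follow which, and predict ahead."""
        def __init__(self):
            # transitions[a][b] counts how often pattern b followed a
            self.transitions = defaultdict(Counter)
            self.previous = None

        def predict(self):
            """The most frequently seen successor of the current pattern."""
            if self.previous is None or not self.transitions[self.previous]:
                return None
            return self.transitions[self.previous].most_common(1)[0][0]

        def observe(self, pattern):
            """Feed one input; report whether it matched the prediction."""
            confirmed = (self.predict() == pattern)
            if self.previous is not None:
                self.transitions[self.previous][pattern] += 1
            self.previous = pattern
            return confirmed, self.predict()

    memory = SequenceMemory()
    for symbol in "abcabcabc" * 3:
        confirmed, prediction = memory.observe(symbol)
    # After a repetition or two the memory predicts each symbol before it
    # arrives; feed it an unexpected symbol and 'confirmed' goes False.

The point is not the (trivial) mechanism but the inversion of emphasis: the measure of such a system is not what it computes, but how well it anticipates its next input.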

The author is a founder of Palm Computing and Handspring (which was subsequently acquired by Palm). He went on to found the Redwood Neuroscience Institute, which has now become part of the Helen Wills Neuroscience Institute at the University of California, Berkeley, and in March of 2005 founded Numenta, Inc., with the goal of developing computer memory systems based on the model of the neocortex presented in this book.

Some academic scientists may sniff at the pretensions of a (very successful) entrepreneur diving into their speciality and trying to figure out how the brain works at a high level. But, hey, nobody else seems to be doing it—the computer scientists are hacking away at their monster programs and parallel machines, the brain community seems stuck on functional imaging (like trying to reverse-engineer a microprocessor in the nineteenth century by looking at its gross chemical and electrical properties), and the neuron experts are off dissecting squid: none of these seem likely to lead to an understanding (there's that word again!) of what's actually going on inside their own tenured, taxpayer-funded skulls. There is undoubtedly much that is wrong in the author's speculations, but then he admits that from the outset and, admirably, presents an appendix containing eleven testable predictions, each of which can falsify all or part of his theory. I've long suspected that intelligence has more to do with memory than computation, so I'll confess to being predisposed toward the arguments presented here, but I'd be surprised if any reader didn't find themselves thinking about their own thought processes in a different way after reading this book. You won't find the answers to the mysteries of the brain here, but at least you'll discover many of the questions worth pondering, and perhaps an idea or two worth exploring with the vast computing power at the disposal of individuals today and the boundless resources of data in all forms available on the Internet.

 Permalink