Medicine

Barry, John M. The Great Influenza. New York: Penguin, [2004] 2005. ISBN 978-0-14-303649-4.
In the year 1800, the practice of medicine had changed little from that in antiquity. The rapid progress in other sciences in the 18th century had had little impact on medicine, which one historian called “the withered arm of science”. This began to change as the 19th century progressed. Researchers, mostly in Europe and especially in Germany, began to lay the foundations for a scientific approach to medicine and public health, understanding the causes of disease and searching for means of prevention and cure. The invention of new instruments for medical examination, anesthesia, and antiseptic procedures began to transform the practice of medicine and surgery.

All of these advances were slow to arrive in the United States. As late as 1900 only one medical school in the U.S. required applicants to have a college degree, and only 20% of schools required a high school diploma. More than a hundred U.S. medical schools accepted any applicant who could pay, and many graduated doctors who had never seen a patient or done any laboratory work in science. In the 1870s, only 10% of the professors at Harvard's medical school had a Ph.D.

In 1873, Johns Hopkins died, leaving his estate of US$ 3.5 million to found a university and hospital. The trustees embarked on an ambitious plan to build a medical school to be the peer of those in Germany, and began to aggressively recruit European professors and Americans who had studied in Europe to build a world class institution. By the outbreak of World War I in Europe, American medical research and education, still concentrated in just a few centres of excellence, had reached the standard set by Germany. It was about to face its greatest challenge.

With the entry of the United States into World War I in April of 1917, millions of young men conscripted for service were packed into overcrowded camps for training and preparation for transport to Europe. These camps, thrown together on short notice, often had only rudimentary sanitation and shelter, with many troops living in tent cities. Large numbers of doctors and especially nurses were recruited into the Army, and by the start of 1918 many were already serving in France. Doctors remaining in private practice in the U.S. were often older men, trained before the revolution in medical education and ignorant of modern knowledge of diseases and the means of treating them.

In all American wars before World War I, more men died from disease than from combat. In the Civil War, two men died from disease for every death on the battlefield. Army Surgeon General William Gorgas vowed that this would not be the case in the current conflict. He was acutely aware that the overcrowded camps, frequent transfers of soldiers among far-flung bases, crowded and unsanitary troop transport ships, and unspeakable conditions in the trenches were a tinderbox just waiting for the spark of an infectious disease to ignite it. But the demand for new troops for the front in France caused his cautions to be overruled, and still more men were packed into the camps.

Early in 1918, a doctor in rural Haskell County, Kansas, began to treat patients with a disease he diagnosed as influenza. But this was nothing like the seasonal influenza with which he was familiar. In typical outbreaks of influenza, the people at greatest risk are the very young (whose immune systems have not been previously exposed to the virus) and the very old, who lack the physical resilience to withstand the assault by the disease. Most deaths are among these groups, leading to a “bathtub curve” of mortality. This outbreak was different: the young and elderly were largely spared, while those in the prime of life were struck down, with many dying quickly of symptoms which resembled pneumonia. Slowly the outbreak receded, and by mid-March things were returning to normal. (The location where, and the mechanism by which, the disease originated remain controversial to this day, and we may never know for sure. After weighing competing theories, the author believes the Kansas origin most likely, but other origins have their proponents.)

That would have been the end of it, had not soldiers from Camp Funston, the second largest Army camp in the U.S., with 56,000 troops, visited their families in Haskell County while on leave. They returned to camp carrying the disease. The spark had landed in the tinderbox. The disease spread outward as troop trains travelled between camps. Often a train would leave carrying healthy troops (infected but not yet symptomatic) and arrive with up to half the company sick and highly infectious to those at the destination. Before long the disease arrived via troop ships at camps and at the front in France.

This was just the first wave. The spring influenza was unusual in the age group it hit most severely, but was not particularly more deadly than typical annual outbreaks. Then, in the fall, the disease returned in a much more virulent form. It is theorised that under the chaotic conditions of wartime a mutant form of the virus had emerged and rapidly spread among the troops and then passed into the civilian population. The outbreak spread around the globe, and few regions escaped. It was particularly devastating to aboriginal populations in remote regions like the Arctic and Pacific islands which had not developed any immunity to influenza.

The pathogen in the second wave could kill directly within a day by destroying the lining of the lung and effectively suffocating the patient. The disease was so virulent and aggressive that some medical researchers doubted it was influenza at all and suspected some new kind of plague. Even those who recovered from the disease had their immunity and defences against respiratory infection so impaired that some who felt well enough to return to work would quickly come down with a secondary infection of bacterial pneumonia which could kill them.

All of the resources of the new scientific medicine were thrown into the battle with the disease, with little or no impact upon its progression. The cause of influenza was not known at the time: some thought it was a bacterial disease while others suspected a virus. Further adding to the confusion was that influenza patients often had a secondary infection of bacterial pneumonia, and the organism which causes that disease was misidentified as the pathogen responsible for influenza. Heroic efforts were made, but the state of medical science in 1918 was simply not up to the challenge posed by influenza.

A century later, influenza continues to defeat every attempt to prevent or cure it, and another global pandemic remains a distinct possibility. Supportive treatment in the developed world and the availability of antibiotics to prevent secondary bacterial pneumonia will reduce the death toll, but a mass outbreak of the virus on the scale of 1918 would quickly swamp all available medical facilities and bring society to the brink as it did then. Even seasonal influenza kills between a quarter million and a half million people a year. The emergence of a killer strain like that of 1918 could increase this number by a factor of ten or twenty.

Influenza is such a formidable opponent due to its structure. It is an RNA virus which, unusually for a virus, has not a single strand of genetic material but seven or eight separate strands of RNA. Some researchers argue that in an organism infected with two or more variants of the virus these strands can mix to form new mutants, allowing the virus to mutate much faster than other viruses with a single strand of genetic material (this is controversial). The virus particle is surrounded by proteins called hemagglutinin (HA) and neuraminidase (NA). HA allows the virus to break into a target cell, while NA allows viruses replicated within the cell to escape to infect others.
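
To make the mixing argument concrete, here is a minimal Python sketch (my own illustration, not from the book) of reassortment between two co-infecting strains: each of the eight segments of a progeny virus can come from either parent, so a single co-infection can in principle yield up to 2⁸ = 256 distinct segment combinations. The segment names and count follow the standard description of influenza A; everything else is made up for illustration.

    import random

    # Influenza A carries its genome on eight separate RNA segments.
    SEGMENTS = ["PB2", "PB1", "PA", "HA", "NP", "NA", "M", "NS"]

    def reassort(parent_a, parent_b, rng):
        """Assemble one progeny genome, drawing each segment from either parent."""
        return tuple(rng.choice((parent_a[s], parent_b[s])) for s in SEGMENTS)

    strain_a = {s: "A-" + s for s in SEGMENTS}   # segments labelled by parent strain
    strain_b = {s: "B-" + s for s in SEGMENTS}

    rng = random.Random(1918)
    progeny = {reassort(strain_a, strain_b, rng) for _ in range(10_000)}
    print(len(progeny), "distinct segment combinations out of", 2 ** len(SEGMENTS), "possible")

This shows only the combinatorics; which combinations are biologically viable is another matter entirely.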

What makes creating a vaccine for influenza so difficult is that these HA and NA proteins are what the body's immune system uses to identify the virus as an invader and kill it. But HA and NA come in a number of variants, and a specific strain of influenza may contain one from column H and one from column N, creating a large number of possibilities. For example, H1N2 is endemic in birds, pigs, and humans. H5N1 caused the bird flu outbreak in 2004, and H1N1 was responsible for the 1918 pandemic. It gets worse. As a child, when you are first exposed to influenza, your immune system will produce antibodies which identify and target the variant to which you were first exposed. If you were infected with and recovered from, say, H3N2, you'll be pretty well protected against it. But if, subsequently, you encounter H1N1, your immune system will recognise it sufficiently to crank out antibodies, but they will be coded to attack H3N2, not the H1N1 you're battling, against which they're useless. Influenza is thus a chameleon, constantly changing its colours to hide from the immune system.

Strains of influenza tend to come in waves, with one HxNy variant dominating for some number of years, then shifting to another. Developers of vaccines must play a guessing game about which variant people are likely to encounter in a given year. This also explains why the 1918 pandemic particularly hit healthy adults. In the decades preceding the 1918 outbreak, the dominant variant had shifted away from H1N1 to another variant, which prevailed for decades until, after 1900, H1N1 came back to the fore. Consequently, when the deadly strain of H1N1 appeared in the fall of 1918, the immune systems of both the young and the elderly were primed for it and protected them, but those in between had immune systems which, when confronted with H1N1, produced antibodies targeted at the other variant, leaving them vulnerable.
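
The age pattern can be made concrete with a toy model. The cutoff years below are purely illustrative assumptions of mine, not dates from the book; the point is only that protection in 1918 depended on which subtype happened to be circulating when a person was first infected as a child.

    def imprinted_subtype(birth_year):
        """Toy assumption: H1N1 circulated before about 1872 and again after 1900,
        with a different subtype dominant in the intervening decades."""
        return "H1N1" if birth_year < 1872 or birth_year >= 1900 else "other"

    def vulnerable_in_1918(birth_year):
        """Antibodies are keyed to the subtype first encountered in childhood;
        a mismatch with the pandemic's H1N1 leaves the person poorly protected."""
        return imprinted_subtype(birth_year) != "H1N1"

    for born in (1850, 1868, 1885, 1895, 1910):
        status = "vulnerable" if vulnerable_in_1918(born) else "protected"
        print("born", born, "(age", 1918 - born, "in 1918):", status)

Run it and the very young and the elderly come out protected, while those in the prime of life, imprinted on the wrong subtype, come out vulnerable: exactly the inverted mortality pattern described above.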

With no medical defence against or cure for influenza even today, the only effective response in the case of an outbreak of a killer strain is public health measures such as isolation and quarantine. Influenza is airborne and highly infectious: the gauze face masks you see in pictures from 1918 were almost completely ineffective. The government response to the outbreak in 1918 could hardly have been worse. After creating military camps which were nothing less than a culture medium containing those in the most vulnerable age range packed in close proximity, once the disease broke out and reports began to arrive that this was something new and extremely lethal, the troop trains and ships continued to run due to orders from the top that more and more men had to be fed into the meat grinder that was the Western Front. This inoculated camp after camp. Then, when the disease jumped into the civilian population and began to devastate cities adjacent to military facilities such as Boston and Philadelphia, the press censors of Wilson's proto-fascist war machine decided that honest reporting of the extent and severity of the disease or measures aimed at slowing its spread would impact “morale” and war production, so newspapers were ordered to either ignore it or print useless happy talk which only accelerated the epidemic. The result was that in the hardest-hit cities, residents, confronted with the reality before their eyes giving the lie to the propaganda they were hearing from authorities, retreated into fear and withdrawal, allowing neighbours to starve rather than risk infection by bringing them food.

As was known in antiquity, the only defence against an infectious disease with no known medical intervention is quarantine. In Western Samoa, the disease arrived in September 1918 on a German steamer. By the time the disease ran its course, 22% of the population of the islands was dead. Just a few kilometres across the ocean in American Samoa, authorities imposed a rigid quarantine and not a single person died of influenza.

We will never know the worldwide extent of the 1918 pandemic. Many of the hardest-hit areas, such as China and India, did not have the infrastructure to collect epidemiological data, and what infrastructure they had collapsed under the impact of the crisis. Estimates are that on the order of 500 million people worldwide were infected and that between 50 and 100 million died: three to five percent of the world's population.

Researchers do not know why the 1918 second wave pathogen was so lethal. The genome has been sequenced and nothing jumps out from it as an obvious cause. Understanding its virulence may require recreating the monster and experimenting with it in animal models. Obviously, this is not something which should be undertaken without serious deliberation beforehand and extreme precautions, but it may be the only way to gain the knowledge needed to treat those infected should a similar wild strain emerge in the future. (It is possible this work may have been done but not published because it could provide a roadmap for malefactors bent on creating a synthetic plague. If this be the case, we'll probably never know about it.)

Although medicine has made enormous strides in the last century, influenza, which defeated the world's best minds in 1918, remains a risk, and in a world with global air travel moving millions between dense population centres, an outbreak today would be even harder to contain. Let us hope that in that dire circumstance authorities will have the wisdom and courage to take the kind of dramatic action which can make the difference between a regional tragedy and a global cataclysm.

October 2014 Permalink

Carreyrou, John. Bad Blood. New York: Alfred A. Knopf, 2018. ISBN 978-1-9848-3363-1.
The drawing of blood for laboratory tests is one of my least favourite parts of a routine visit to the doctor's office. Now, I have no fear of needles and hardly notice the stick, but frequently the doctor's assistant who draws the blood (whom I've nicknamed Vampira) has difficulty finding the vein to get a good flow and has to try several times. On one occasion she made an internal puncture which resulted in a huge, ugly bruise that looked like I'd slammed a car door on my arm. I wondered why they need so much blood, and why draw it into so many different containers? (Eventually, I researched this, having been intrigued by the issue during the O. J. Simpson trial; if you're curious, here is the information.) Then, after the blood is drawn, it has to be sent off to the laboratory, which sends back the results days later. If something pops up in the test results, you have to go back for a second visit with the doctor to discuss it.

Wouldn't it be great if they could just stick a fingertip and draw a drop or two of blood, as is done by diabetics to test blood sugar, then run all the tests on it? Further, imagine if, after taking the drop of blood, it could be put into a desktop machine right in the doctor's office which would, in a matter of minutes, produce test results you could discuss immediately with the doctor. And if such a technology existed and followed the history of decline in price with increase in volume which has characterised other high technology products since the 1970s, it might be possible to deploy the machines into the homes of patients being treated with medications so their effects could be monitored and relayed directly to their physicians in case an anomaly was detected. It wouldn't quite be a Star Trek medical tricorder, but it would be one step closer. With the cost of medical care rising steeply, automating diagnostic blood tests and bringing them to the mass market seemed an excellent candidate as the “next big thing” for Silicon Valley to revolutionise.

This was the vision that came to 19-year-old Elizabeth Holmes after completing a summer internship at the Genome Institute of Singapore following her freshman year as a chemical engineering major at Stanford. Holmes had decided on a career in entrepreneurship from an early age and, after her first semester, told her father, “No, Dad, I'm not interested in getting a Ph.D. I want to make money.” And Stanford, in the heart of Silicon Valley, was surrounded by companies started by professors and graduates who had turned inventions into vast fortunes. With only one year of college behind her, she was sure she'd found her opportunity. She showed the patent application she'd drafted for an arm patch that would diagnose medical conditions to Channing Robertson, professor of chemical engineering at Stanford, and Shaunak Roy, the Ph.D. student in whose lab she had worked as an assistant during her freshman year. Robertson was enthusiastic, and when Holmes said she intended to leave Stanford and start a company to commercialise the idea, he encouraged her. When the company was incorporated in 2004, Roy, then a newly-minted Ph.D., became its first employee and Robertson joined the board.

From the outset, the company was funded by other people's money. Holmes persuaded a family friend, Tim Draper, a second-generation venture capitalist who had backed, among other companies, Hotmail, to invest US$ 1 million in first round funding. Draper was soon joined by Victor Palmieri, a corporate turnaround artist and friend of Holmes' father. The company was named Theranos, from “therapy” and “diagnosis”. Elizabeth, unlike this scribbler, had a lifelong aversion to needles, and the invention she described in the business plan pitched to investors was informed by this. A skin patch would draw tiny quantities of blood without pain by means of “micro-needles”, the blood would be analysed by micro-miniaturised sensors in the patch and, if needed, medication could be injected. A wireless data link would send results to the doctor.

This concept, and Elizabeth's enthusiasm and high-energy pitch, allowed her to recruit additional investors, raising almost US$ 6 million in 2004. But there were some who failed to be persuaded: MedVentures Associates, a firm that specialised in medical technology, turned her down after discovering she had no answers for the technical questions raised in a meeting with the partners, who had in-depth experience with diagnostic technology. This would be a harbinger of the company's fund-raising in the future: in its entire history, not a single venture fund or investor with experience in medical or diagnostic technology would put money into the company.

Shaunak Roy, who, unlike Holmes, actually knew something about chemistry, quickly realised that Elizabeth's concept, while appealing to the uninformed, was science fiction, not science, and no amount of arm-waving about nanotechnology, microfluidics, or laboratories on a chip would suffice to build something which was far beyond the state of the art. This led to a “de-scoping” of the company's ambition—the first of many which would happen over succeeding years. Instead of Elizabeth's magical patch, a small quantity of blood would be drawn from a finger stick and placed into a cartridge around the size of a credit card. The disposable cartridge would then be placed into a desktop “reader” machine, which would, using the blood and reagents stored in the cartridge, perform a series of analyses and report the results. This was originally called Theranos 1.0, but after a series of painful redesigns, was dubbed the “Edison”. This was the prototype Theranos ultimately showed to potential customers and prospective investors.

This was a far cry from the original ambitious concept. The hundreds of laboratory tests doctors can order are divided into four major categories: immunoassays, general chemistry, hæmatology, and DNA amplification. In immunoassay tests, blood plasma is exposed to an antibody that detects the presence of a substance in the plasma. The antibody contains a marker which can be detected by its effect on light passed through the sample. Immunoassays are used in a number of common blood tests, such as the 25(OH)D assay used to test for vitamin D deficiency, but cannot be used for other frequently ordered tests such as blood sugar and red and white blood cell counts. The Edison could only perform what are called “chemiluminescent immunoassays”, and thus could only run a fraction of the tests regularly ordered. The rationale for installing an Edison in the doctor's office was dramatically reduced if it could only do some tests but still required a venous blood draw be sent off to the laboratory for the balance.

This didn't deter Elizabeth, who combined her formidable salesmanship with arm-waving about the capabilities of the company's products. She was working on a deal to sell four hundred Edisons to the Mexican government to cope with an outbreak of swine flu, which would generate immediate revenue. Money was much on the minds of Theranos' senior management. By the end of 2009, the company had burned through the US$ 47 million raised in its first three rounds of funding and, without a viable product or prospects for sales, would have difficulty keeping the lights on.

But the real bonanza loomed on the horizon in 2010. Drugstore giant Walgreens was interested in expanding their retail business into the “wellness market”: providing in-store health services to their mass market clientèle. Theranos pitched them on offering in-store blood testing. Doctors could send their patients to the local Walgreens to have their blood tested from a simple finger stick and eliminate the need to draw blood in the office or deal with laboratories. With more than 8,000 locations in the U.S., if each were to be equipped with one Edison, the revenue to Theranos (including the single-use testing cartridges) would put them on the map as another Silicon Valley disruptor that went from zero to hundreds of millions in revenue overnight. But here, as well, the Elizabeth effect was in evidence. Of the 192 tests she told Walgreens Theranos could perform, fewer than half were immunoassays the Edisons could run. The rest could be done only on conventional laboratory equipment, and certainly not on a while-you-wait basis.

Walgreens wasn't the only potential saviour on the horizon. Grocery godzilla Safeway, struggling with sales and earnings which seemed to have reached a peak, saw in-store blood testing with Theranos machines as a high-margin profit centre. They loaned Theranos US$ 30 million and began to plan for installation of blood testing clinics in their stores.

But there was a problem, and as the months wore on, this became increasingly apparent to people at both Walgreens and Safeway, although dismissed by those in senior management under the spell of Elizabeth's reality distortion field. Deadlines were missed. Simple requests, such as A/B comparison tests run on the Theranos hardware and at conventional labs, were first refused, then postponed, then run but the results not disclosed. The list of tests which could be run, how blood for them would be drawn, and how they would be processed seemed to dissolve into fog whenever specific requests were made for this information, which was essential for planning the in-store clinics.

There was, indeed, a problem, and it was pretty severe, especially for a start-up which had burned through US$ 50 million and sold nothing. The product didn't work. Not only could the Edison only run a fraction of the tests its prospective customers had been led by Theranos to believe it could, but for those it did run, the results were wildly unreliable. The small quantity of blood used in the test introduced random errors due to dilution of the sample; the small tubes in the cartridge were prone to clogging; and capillary blood collected from a finger stick was prone to errors due to “hemolysis”, the rupture of red blood cells, which is minimal in a venous blood draw but so prevalent in finger stick blood it could lead to some tests producing values which indicated the patient was dead.
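
The dilution problem is easy to see numerically. In this rough sketch (illustrative numbers only, not Theranos data), an analyser with a fixed absolute measurement noise is fed a diluted sample and its reading is scaled back up; the scatter in the reported result grows by the same dilution factor.

    import random
    import statistics

    def reported_value(true_conc, dilution_factor, rng, analyser_sd=0.1):
        """Dilute the sample, measure with fixed instrument noise, scale back up."""
        reading = rng.gauss(true_conc / dilution_factor, analyser_sd)
        return reading * dilution_factor

    rng = random.Random(7)
    true_potassium = 4.0                     # mmol/L, a plausible value (illustrative)
    for factor in (1, 5, 10):
        runs = [reported_value(true_potassium, factor, rng) for _ in range(10_000)]
        print("dilution x", factor, ": scatter (SD) of reported values ≈",
              round(statistics.stdev(runs), 2), "mmol/L")

On top of this random scatter, hemolysis in fingerstick blood adds a systematic bias (for potassium, a large upward one) which no amount of averaging removes.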

Meanwhile, people who came to work at Theranos quickly became aware that it was not a normal company, even by the eccentric standards of Silicon Valley. There was an obsession with security, with doors opened by badge readers; logging of employee movement; information restricted to narrow silos prohibiting collaboration between, say, engineering and marketing which is the norm in technological start-ups; monitoring of employee Internet access, E-mail, and social media presence; a security detail of menacing-looking people in black suits and earpieces (which eventually reached a total of twenty); a propensity of people, even senior executives, to “vanish”, Stalin-era purge-like, overnight; and a climate of fear that anybody, employee or former employee, who spoke about the company or its products to an outsider, especially the media, would be pursued, harassed, and bankrupted by lawsuits. There aren't many start-ups whose senior scientists are summarily demoted and subsequently commit suicide. That happened at Theranos. The company held no memorial for him.

Throughout all of this, a curious presence in the company was Ramesh (“Sunny”) Balwani, a Pakistani-born software engineer who had made a fortune of more than US$ 40 million in the dot-com boom and cashed out before the bust. He joined Theranos in late 2009 as Elizabeth's second in command and rapidly became known as a hatchet man, domineering boss, and clueless when it came to the company's key technologies (on one occasion, an engineer mentioned a robotic arm's “end effector”, after which Sunny would frequently speak of its “endofactor”). Unbeknownst to employees and investors, Elizabeth and Sunny had been living together since 2005. Such an arrangement would be a major scandal in a public company, but even in a private firm, concealing such information from the board and investors is a serious breach of trust.

Let's talk about the board, shall we? Elizabeth was not only persuasive, but well-connected. She would parlay one connection into another, and before long had recruited many prominent figures, including:

  • George Shultz (former U.S. Secretary of State)
  • Henry Kissinger (former U.S. Secretary of State)
  • Bill Frist (former U.S. Senator and medical doctor)
  • James Mattis (General, U.S. Marine Corps)
  • Riley Bechtel (Chairman and former CEO, Bechtel Group)
  • Sam Nunn (former U.S. Senator)
  • Richard Kovacevich (former Wells Fargo chairman and CEO)

Later, super-lawyer David Boies would join the board, and lead its attacks against the company's detractors. It is notable that, as with its investors, not a single board member had experience in medical or diagnostic technology. Bill Frist was an M.D., but his speciality was heart and lung transplants, not laboratory tests.

By 2014, Elizabeth Holmes had come onto the media radar. Photogenic, articulate, and with a story of high-tech disruption of an industry much in the news, she began to be featured as the “female Steve Jobs”, which must have pleased her, since she affected black turtlenecks, kale shakes, and even a car with no license plates to emulate her role model. She appeared on the cover of Fortune in January 2014, made the Forbes 400 list of the wealthiest Americans shortly thereafter, was featured in puff pieces in business and general market media, and was named by Time as one of the hundred most influential people in the world. The year 2014 closed with another glowing profile in the New Yorker. This would be the beginning of the end, as it happened to be read by somebody who actually knew something about blood testing.

Adam Clapper, a pathologist in Missouri, spent his spare time writing Pathology Blawg, with a readership of practising pathologists. Clapper read what Elizabeth was claiming to do with a couple of drops of blood from a finger stick and it didn't pass the sniff test. He wrote a sceptical piece on his blog and, as it passed from hand to hand, he became a lightning rod for others dubious of Theranos' claims, including those with direct or indirect experience with the company. Earlier, he had helped a Wall Street Journal reporter comprehend the tangled web of medical laboratory billing, and he decided to pass on the tip to the author of this book.

Thus began the unravelling of one of the greatest scams and scandals in the history of high technology, Silicon Valley, and venture investing. At the peak, privately-held Theranos was valued at around US$ 9 billion, with Elizabeth Holmes holding around half of its common stock, and with one of those innovative capital structures of which Silicon Valley is so fond, 99.7% of the voting rights. Altogether, over its history, the company raised around US$ 900 million from investors (including US$ 125 million from Rupert Murdoch in the US$ 430 million final round of funding). Most of the investors' money was ultimately spent on legal fees as the whole fairy castle crumbled.

The story of the decline and fall is gripping, involving the grandson of a Secretary of State, gumshoes following whistleblowers and reporters, what amounts to legal terrorism by the ever-slimy David Boies, courageous people who stood their ground in the interest of scientific integrity against enormous personal and financial pressure, and the saga of one of the most cunning and naturally talented confidence women ever, equipped with only two semesters of freshman chemical engineering, who managed to raise and blow through almost a billion dollars of other people's money without checking off the first box on the conventional start-up check list: “Build the product”.

I have, in my career, met three world-class con men. Three times, I (just barely) managed to pick up the warning signs and beg my associates to walk away. Each time I was ignored. After reading this book, I am absolutely sure that had Elizabeth Holmes pitched me on Theranos (about which I never heard before the fraud began to be exposed), I would have been taken in. Walker's law is “Absent evidence to the contrary, assume everything is a scam”. A corollary is “No matter how cautious you are, there's always a confidence man (or woman) who can scam you if you don't do your homework.”

Here is Elizabeth Holmes at Stanford in 2013, when Theranos was riding high and she was doing her “female Steve Jobs” act.

Elizabeth Holmes at Stanford: 2013

This is a CNN piece, filmed after the Theranos scam had begun to collapse, in which you can still glimpse the Elizabeth Holmes reality distortion field at full intensity directed at CNN medical correspondent Sanjay Gupta. There are several curious things about this video. The machine that Gupta is shown is the “miniLab”, a prototype second-generation machine which never worked acceptably, not the Edison, which was actually used in the Walgreens and Safeway tests. Gupta's blood is drawn and tested, but the process used to perform the test is never shown. The result reported is a cholesterol test, but the Edison cannot perform such tests. In the plans for the Walgreens and Safeway roll-outs, such tests were performed on purchased Siemens analysers which had been secretly hacked by Theranos to work with blood diluted well below their regulatory-approved specifications (the dilution was required due to the small volume of blood from the finger stick). Since the miniLab never really worked, the odds are that Gupta's blood was tested on one of the Siemens machines, not a Theranos product at all.

CNN: Inside the Theranos Lab (2016)

In a June 2018 interview, author John Carreyrou recounts the story of Theranos and his part in revealing the truth.

John Carreyrou on investigating Theranos (2018)

If you are a connoisseur of the art of the con, here is a masterpiece. After the Wall Street Journal exposé had broken, after retracting tens of thousands of blood tests, and after Theranos had been banned from running a clinical laboratory by its regulators, Holmes got up before an audience of 2500 people at the meeting of the American Association of Clinical Chemistry and turned up the reality distortion field to eleven. Watch a master at work. She comes on the stage at the six minute mark.

Elizabeth Holmes at the American Association of Clinical Chemistry (2016)

July 2018 Permalink

Cordain, Loren. The Paleo Diet. Hoboken, NJ: John Wiley & Sons, 2002. ISBN 978-0-470-91302-4.
As the author of a diet book, I don't read many self-described “diet books”. First of all, I'm satisfied with the approach to weight management described in my own book; second, I don't need to lose weight; and third, I find most “diet books” built around gimmicks with little justification in biology and prone to prescribe regimes that few people are likely to stick with long enough to achieve their goal. What motivated me to read this book was a talk by Michael Rose at the First Personalized Life Extension Conference in which he mentioned the concept and this book not in conjunction with weight reduction but rather with the extension of healthy lifespan in humans. Rose's argument, which is grounded in evolutionary biology and paleoanthropology, is somewhat subtle and well summarised in this article.

At the core of Rose's argument and that of the present book is the observation that while the human genome is barely different from that of human hunter-gatherers a million years ago, our present-day population has had at most 200 to 500 generations to adapt to the very different diet which emerged with the introduction of agriculture and animal husbandry. From an evolutionary standpoint, this is a relatively short time for adaptation, and here is the key thing (argued by Rose, but not in this book): even if modern humans had evolved adaptations to the agricultural diet (as in some cases they clearly have, lactose tolerance persisting into adulthood being one obvious example), those adaptations will not, from the simple mechanism of evolution, select out diseases caused by the new diet which only manifest themselves after the age of last reproduction in the population. So, if eating the agricultural diet (not to mention the horrors we've invented in the last century) were the cause of late-onset diseases such as cancer, cardiovascular problems, and type 2 diabetes, then evolution would have done nothing to select out the genes responsible for them, since these diseases strike most people after the age at which they've already passed on their genes to their children. Consequently, while it may be fine for young people to eat grain, dairy products, and other agricultural-era innovations, folks over the age of forty may be asking for trouble by consuming foods which evolution hasn't had the chance to mold their genomes to tolerate. People whose ancestors shifted to the agricultural lifestyle much more recently, including many of African and aboriginal descent, have little or no adaptation to the agricultural diet, and may experience problems even earlier in life.
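
The heart of that argument, that selection is nearly blind to diseases which strike only after reproduction, can be illustrated with a toy population-genetics simulation (my own sketch, not Rose's or Cordain's; the population size, starting frequency, and generation count are arbitrary):

    import random

    def next_generation(freq, pop_size, kills_before_reproduction, rng):
        """One generation of a toy haploid model: carriers of a 'disease' allele
        are removed before breeding only if the disease strikes early in life."""
        parents = []
        for _ in range(pop_size):
            carrier = rng.random() < freq
            if carrier and kills_before_reproduction:
                continue                     # never gets to reproduce
            parents.append(carrier)
        if not parents:
            return 0.0
        return sum(rng.choice(parents) for _ in range(pop_size)) / pop_size

    rng = random.Random(42)
    for early_onset in (True, False):
        freq = 0.2
        for _ in range(200):
            freq = next_generation(freq, 10_000, early_onset, rng)
        when = "before" if early_onset else "after"
        print("disease strikes", when, "reproduction:",
              "allele frequency after 200 generations ≈", round(freq, 2))

The early-onset allele is purged almost immediately; the late-onset allele merely drifts, which is why late-life consequences of the agricultural diet, if real, would persist in the gene pool.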

In this book, the author doesn't make these fine distinctions but rather argues that everybody can benefit from a diet resembling that which the vast majority of our ancestors—hunter-gatherers predating the advent of sedentary agriculture—ate, and to which evolution has molded our genome over that long expanse of time. This is not a “diet book” in the sense of a rigid plan for losing weight. Instead, it is a manual for adopting a lifestyle, based entirely upon non-exotic foods readily available at the supermarket, which approximates the mix of nutrients consumed by our distant ancestors. There are the usual meal plans and recipes, but the bulk of the book is a thorough survey, with extensive citations to the scientific literature, of what hunter-gatherers actually ate, the links scientists have found between the composition of the modern diet and the emergence of “diseases of civilisation” among populations that have transitioned to it in historical times, and the evidence for specific deleterious effects of major components of the modern diet such as grains and dairy products.

Not to over-simplify, but you can go a long way toward the ancestral diet simply by going to the store with an “anti-shopping list” of things not to buy, principally:

  • Grain, or anything derived from grains (bread, pasta, rice, corn)
  • Dairy products (milk, cheese, butter)
  • Fatty meats (bacon, marbled beef)
  • Starchy tuber crops (potatoes, sweet potatoes)
  • Salt or processed foods with added salt
  • Refined sugar or processed foods with added sugar
  • Oils with a high omega 6 to omega 3 ratio (safflower, peanut)

And basically, that's it! Apart from the list above you can buy whatever you want, eat it whenever you like in whatever quantity you wish, and the author asserts that if you're overweight you'll soon see your weight dropping toward your optimal weight, a variety of digestive and other problems will begin to clear up, you'll have more energy and a more consistent energy level throughout the day, and that you'll sleep better. Oh, and your chances of contracting cancer, diabetes, or cardiovascular disease will be dramatically reduced.

In practice, this means eating a lot of lean meat, seafood, fresh fruit and fresh vegetables, and nuts. As the author points out, even if you have a mound of cooked boneless chicken breasts, broccoli, and apples on the table before you, you're far less likely to pig out on them compared to, say, a pile of doughnuts, because the natural foods don't give you the immediate blood sugar hit the highly glycemic processed food does. And even if you do overindulge, the caloric density in the natural foods is so much lower your jaw will get tired chewing or your gut will bust before you can go way over your calorie requirements.

Now, even if the science is sound (there are hundreds of citations of peer-reviewed publications in the bibliography, but then nutritionists are forever publishing contradictory “studies” on any topic you can imagine, and in any case epidemiology cannot establish causation) and the benefits of adopting this diet are as immediate, dramatic, and important for long-term health as claimed, a lot of people are going to have trouble with what is recommended here. Food is a lot more to humans and other species (as anybody who's had a “picky eater” cat can testify) than just molecular fuel and construction material for our bodies. Our meals nourish the soul as well as the body, and among humans shared meals are a fundamental part of our social interaction which evolution has doubtless had time to write into our genes. If you go back and look at that list of things not to eat, you'll probably discover that just about any “comfort food” you cherish runs afoul of one or more of the forbidden ingredients. This means that contemplating the adoption of this diet as a permanent lifestyle change can look pretty grim, unless or until you find suitable replacements that thread among the constraints. The recipes presented here are interesting, but still come across to me (not having tried them) as pretty Spartan. And recall that even Spartans lived a pretty sybaritic lifestyle compared to your average hunter-gatherer band. But, hey, peach fuzz is entirely cool!

The view of the mechanics of weight loss and gain and the interaction between exercise and weight reduction presented here is essentially 100% compatible with my own in The Hacker's Diet.

This was intriguing enough that I decided to give it a try starting a couple of weeks ago. (I have been adhering, more or less, to the food selection guidelines, but not the detailed meal plans.) The results so far are intriguing but, at this early date, inconclusive. The most dramatic effect was an almost immediate (within the first three days) crash in my always-pesky high blood pressure. This may be due entirely to putting away the salt shaker (an implement of which I have been inordinately fond since childhood), but whatever the cause, it's taken about 20 points off the systolic and 10 off the diastolic, throughout the day. Second, I've seen a consistent downward bias in my weight. Now, as I said, I didn't try this diet to lose weight (although I could drop a few kilos and still be within the target band for my height and build, and wouldn't mind doing so). In any case, these are short-term results and may include transient adaptation effects. I haven't been hungry for a moment nor have I experienced any specific cravings (except the second-order kind for popcorn with a movie). It remains to be seen what will happen when I next attend a Swiss party and have to explain that I don't eat cheese.

This is a very interesting nutritional thesis, backed by a wealth of impressive research of which I was previously unaware. It flies in the face of much of the conventional wisdom on diet and nutrition, and yet viewed from the standpoint of evolution, it makes a lot of sense. You will find the case persuasively put here and perhaps be tempted to give it a try.

December 2010 Permalink

De Vany, Arthur. The New Evolution Diet. New York: Rodale Books, 2011. ISBN 978-1-60529-183-3.
The author is an economist best known for his research into the economics of Hollywood films, and his demonstration that the Pareto distribution applies to the profitability of Hollywood productions, empirically falsifying many entertainment business nostrums about a correlation between production cost and the “star power” of the cast on the one hand and actual performance at the box office on the other. When his son, and later his wife, developed diabetes and the medical consensus treatment seemed to send both into a downward spiral, his economist's sense for the behaviour of complex nonlinear systems with feedback and delays caused him to suspect that the regimen prescribed for diabetics was based on a simplistic view of the system aimed at treating the symptoms rather than the cause. This led him to an in-depth investigation of human metabolism and nutrition, grounded in the evolutionary heritage of our species (this is fully documented here—indeed, almost half of the book is end notes and source references, which should not be neglected: there is much of interest there).

His conclusion was that our genes, which have scarcely changed in the last 40,000 years, were adapted to the hunter-gatherer lifestyle that our hominid ancestors lived for millions of years before the advent of agriculture. Our present-day diet and way of life could not be more at variance with our genetic programming, so it shouldn't be a surprise that we see a variety of syndromes, including obesity, cardiovascular diseases, type 2 diabetes, and late-onset diseases such as many forms of cancer, which are extremely rare among populations whose diet and lifestyle remain closer to those of ancestral humans. Strong evidence for this hypothesis comes from nomadic aboriginal populations which, once settled into villages and transitioned to the agricultural diet, promptly manifested diseases, categorised as “metabolic syndrome”, which were previously unknown among them.

This is very much the same conclusion as that of The Paleo Diet (December 2010), and I recommend you read both of these books as they complement one another. The present volume goes deeper into the biochemistry underlying its dietary recommendations, and explores what the hunter-gatherer lifestyle has to say about the exercise to which we are adapted. Our ancestors' lives were highly chaotic: they ate when they made a kill or found food to gather and fasted until the next bounty. They engaged in intense physical exertion during a hunt or battle, and then passively rested until the next time. Modern times have made us slaves to the clock: we do the same things at the same times on a regular schedule. Even those who incorporate strenuous exercise into their routine tend to do the same things at the same time on the same days. The author argues that this is not remotely what our heritage has evolved us for.

Once Pareto gets into your head, it's hard to get him out. Most approaches to diet, nutrition, and exercise (including my own) view the human body as a system near equilibrium. The author argues that one shouldn't look at the mean but rather the kurtosis of the distribution, as it's the extremes that matter—don't tediously “do cardio” like all of the treadmill trudgers at the gym, but rather push your car up a hill every now and then, or randomly raise your heart rate into the red zone.
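
Since the point is statistical, here is a small sketch of my own (with arbitrary numbers, not anything from the book) comparing two exercise regimens with the same average effort: one steady, one drawn from a heavy-tailed Pareto distribution. The means match; the extremes, which De Vany argues are what the body actually responds to, differ enormously.

    import random

    rng = random.Random(0)
    sessions = 10_000

    steady = [50.0] * sessions                               # the treadmill trudger
    raw = [rng.paretovariate(1.5) for _ in range(sessions)]  # heavy-tailed efforts
    scale = sum(steady) / sum(raw)                           # match the total (and mean) effort
    bursty = [x * scale for x in raw]                        # mostly easy days, rare brutal ones

    for name, efforts in (("steady", steady), ("bursty (Pareto)", bursty)):
        print(name, "- mean effort:", round(sum(efforts) / len(efforts), 1),
              "hardest single session:", round(max(efforts), 1))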

This all makes perfect sense to me. I happened to finish this book almost precisely six months after adopting my own version of the paleo diet, not from a desire to lose weight (I'm entirely happy with my weight, which hasn't varied much in the last twenty years, thanks to the feedback mechanism of The Hacker's Diet) but due to the argument that it averts late-onset diseases and extends healthy lifespan. Well, it's too early to form any conclusions on either of these, and in any case you can draw any curve you like through a sample size of one, but after half a year on paleo I can report that my weight is stable, my blood pressure is right in the middle of the green zone (as opposed to low-yellow before), I have more energy, sleep better, and have seen essentially all of the aches and pains and other symptoms of low-level inflammation disappear. Will you have cravings for things you've forgone when you transition to paleo? Absolutely—in my experience it takes about three months for them to go away. When I stopped salting my food, everything tasted like reprocessed blaah for the first couple of weeks, but now I appreciate the flavours below the salt.

For the time being, I'm going to continue this paleo thing, not primarily due to the biochemical and epidemiological arguments here, but because I've been doing it for six months and I feel better than I have for years. I am a creature of habit, and I find it very difficult to introduce kurtosis into my lifestyle: when exogenous events do so, I deem it an “entropic storm”. When it's 15:00, I go for my one hour walk. When it's 18:00, I eat, etc. Maybe I should find some way to introduce randomness into my life….

An excellent Kindle edition is available, with the table of contents, notes, and index all properly linked to the text.

June 2011 Permalink

Dworkin, Ronald W. Artificial Happiness. New York: Carroll & Graf, 2006. ISBN 0-7867-1714-9.
Western societies, with the United States in the lead, appear to be embarked on a grand-scale social engineering experiment with little consideration of the potentially disastrous consequences both for individuals and the society at large. Over the last two decades “minor depression”, often no more than what, in less clinical nomenclature, one would term unhappiness, has come to be seen as a medical condition treatable with pharmaceuticals, and prescription of these medications, mostly by general practitioners, not psychiatrists or psychologists, has skyrocketed, with drugs such as Prozac, Paxil, and Zoloft regularly appearing on lists of the most frequently prescribed. Tens of millions of people in the United States take these pills, which are being prescribed to children and adolescents as well as adults.

Now, there's no question that these medications have been a Godsend for individuals suffering from severe clinical depression, which is now understood in many cases to be an organic disease caused by imbalances in the metabolism of neurotransmitters in the brain. But this vast public health experiment in medicating unhappiness is another thing altogether. Unhappiness, like pain, is a signal that something's wrong, and a motivator to change things for the better. But if unhappiness is seen as a disease which is treated by swallowing pills, this signal is removed, and people are numbed or stupefied out of taking action to eliminate the cause of their unhappiness: changing jobs or careers, reducing stress, escaping from abusive personal relationships, or embarking on some activity which they find personally rewarding. Self esteem used to be thought of as something you earned from accomplishing difficult things; once it becomes a state of mind you get from a bottle of pills, then what will become of all the accomplishments the happily medicated no longer feel motivated to achieve?

These are serious questions, and deserve serious investigation and a book-length treatment of the contemporary scene and trends. This is not, however, that book. The author is an M.D. anæsthesiologist with a Ph.D. in political philosophy from Johns Hopkins University, and a senior fellow at the Hudson Institute—impressive credentials. Notwithstanding them, the present work reads like something written by somebody who learned Marxism from a comic book. Individuals, entire professions, and groups as heterogeneous as clergy of organised religions are portrayed like cardboard cutouts—with stick figures drawn on them—in crayon. Each group the author identifies is seen as acting monolithically toward a specific goal, which is always nefarious in some way, advancing an agenda based solely on its own interest. The possibility that a family doctor might prescribe antidepressants for an unhappy patient in the belief that he or she is solving a problem for the patient is scarcely considered. No, the doctor is part of a grand conspiracy of “primary care physicians” advancing an agenda to usurp the “turf” (a term he uses incessantly) of first psychiatrists, and finally organised religion.

After reading this entire book, I still can't decide whether the author is really as stupid as he seems, or simply writes so poorly that he comes across that way. Each chapter starts out lurching toward a goal, loses its way and rambles off in various directions until the requisite number of pages have been filled, and then states a conclusion which is not justified by the content of the chapter. There are few clichés in the English language which are not used here—again and again. Here is an example of one of hundreds of paragraphs to which the only rational reaction is “Huh?”.

So long as spirituality was an idea, such as believing in God, it fell under religious control. However, if doctors redefined spirituality to mean a sensual phenomenon—a feeling—then doctors would control it, since feelings had long since passed into the medical profession's hands, the best example being unhappiness. Turning spirituality into a feeling would also help doctors square the phenomenon with their own ideology. If spirituality were redefined to mean a feeling rather than an idea, then doctors could group spirituality with all the other feelings, including unhappiness, thereby preserving their ideology's integrity. Spirituality, like unhappiness, would become a problem of neurotransmitters and a subclause of their ideology. (Page 226.)

A reader opening this book is confronted with 293 pages of this. This paragraph appears in chapter nine, “The Last Battle”, which describes the Manichean struggle between doctors and organised religion in the 1990s for the custody of the souls of Americans, ending in a total rout of religion. Oh, you missed that? Me too.

Mass medication with psychotropic drugs is a topic which cries out for a statistical examination of its public health dimensions, but Dworkin relates only anecdotes of individuals he has known personally, all of whose minds he seems to be able to read, diagnosing their true motivations which even they don't perceive, and discerning their true destiny in life, which he believes they are failing to follow due to medication for unhappiness.

And if things weren't muddled enough, he drags in “alternative medicine” (the modern, polite term for what used to be called “quackery”) and “obsessive exercise” as other sources of Artificial Happiness (which he capitalises everywhere), which is rather odd since he doesn't believe either works except through the placebo effect. Isn't it just a little bit possible that some of those people working out at the gym are doing so because it makes them feel better and likely to live longer? Dworkin tries to envision the future for the Happy American, decoupled from the traditional trajectory through life by the ability to experience chemically induced happiness at any stage. Here, he seems to simultaneously admire and ridicule the culture of the 1950s, his knowledge of which seems to be drawn from re-runs of “Leave it to Beaver”. In the conclusion, he modestly proposes a solution to the problem which requires completely restructuring medical education for general practitioners and redefining the mission of all organised religions. At least he doesn't seem to have a problem with self-esteem!

October 2006 Permalink

Johnson, Steven. The Ghost Map. New York: Riverhead Books, 2006. ISBN 1-59448-925-4.
From the dawn of human civilisation until sometime in the nineteenth century, cities were net population sinks—the increased mortality from infectious diseases, compounded by the unsanitary conditions, impure water, and food transported from the hinterland and stored without refrigeration, so shortened the lives of city-dwellers (except for the ruling class and the wealthy, a small fraction of the population) that a city's population was maintained only by a constant net migration to it from the countryside. In densely-packed cities, not only does an infected individual come into contact with many more potential victims than in a rural environment, but highly virulent strains of infectious agents which would “burn out” due to rapidly killing their hosts in farm country or a small village can prosper in a city, since each infected host still has the opportunity to infect many others before succumbing. Cities can be thought of as Petri dishes for evolving killer microbes.
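
The point about density can be illustrated with a crude branching-process sketch (my own, with made-up parameters, not anything from the book): a strain so virulent that its victims are infectious for only a couple of days dies out where daily contacts are few, but infects much of the population where contacts are many.

    import random

    def outbreak_size(contacts_per_day, days_infectious, p_transmit=0.05,
                      population=10_000, seed_cases=5, seed=1):
        """Total number ever infected in a crude branching/SIR-style model."""
        rng = random.Random(seed)
        susceptible = population - seed_cases
        total = active = seed_cases
        while active and susceptible:
            new_cases = 0
            for _ in range(active):
                for _ in range(contacts_per_day * days_infectious):
                    if rng.random() < p_transmit * susceptible / population:
                        new_cases += 1
                        susceptible -= 1
            total += new_cases
            active = new_cases
        return total

    # A strain so virulent its victims are infectious for only two days:
    print("village, 3 contacts/day:", outbreak_size(3, 2))    # R0 ≈ 0.3, fizzles out
    print("city,  15 contacts/day:", outbreak_size(15, 2))    # R0 ≈ 1.5, takes off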

No civic culture medium was as hospitable to pathogens as London in the middle of the 19th century. Its population, 2.4 million in 1851, had exploded from just one million at the start of the century, and all of these people had been accommodated in a sprawling metropolis almost devoid of what we would consider a public health infrastructure. Sewers, where they existed, were often open and simply dumped into the Thames, whence other Londoners drew their drinking water, downstream. Other residences dumped human waste in cesspools, emptied occasionally (or maybe not) by “night-soil men”. Imperial London was a smelly, and a deadly, place. Observing it first-hand is what motivated Friedrich Engels to document and deplore The Condition of the Working Class in England (January 2003).

Among the diseases which cut down inhabitants of cities, one of the most feared was cholera. In 1849, an outbreak killed 14,137 in London, and nobody knew when or where it might strike next. The prevailing theory of disease at this epoch was that infection was caused by and spread through “miasma”: contaminated air. Given how London stank and how deadly it was to its inhabitants, this would have seemed perfectly plausible to people living before the germ theory of disease was propounded. Edwin Chadwick, head of the General Board of Health in London at the epoch, went so far as to assert (p. 114) “all smell is disease”. Chadwick was, in many ways, one of the first advocates and implementers of what we have come to call “big government”—that the state should take an active role in addressing social problems and providing infrastructure for public health. Relying upon the accepted “miasma” theory and empowered by an act of Parliament, he spent the 1840s trying to eliminate the stink of the cesspools by connecting them to sewers which drained their offal into the Thames. Chadwick was, by doing so, to provide one of the first demonstrations of that universal concomitant of big government, unintended consequences: “The first defining act of a modern, centralized public-health authority was to poison an entire urban population.” (p. 120).

When, in 1854, a singularly virulent outbreak of cholera struck the Soho district of London, physician and pioneer in anæsthesia John Snow found himself at the fulcrum of a revolution in science and public health toward which he had been working for years. Based upon his studies of the 1849 cholera outbreak, Snow had become convinced that the pathogen spread through contamination of water supplies by the excrement of infected individuals. He had published a monograph laying out this theory in 1849, but it swayed few readers from the prevailing miasma theory. He was continuing to document the case when cholera exploded in his own neighbourhood. Snow's mind was not only prepared to consider a waterborne infection vector, he was also one of the pioneers of the emerging science of epidemiology: he was a founding member of the London Epidemiological Society in 1850. Snow's real-time analysis of the epidemic caused him to believe that the vector of infection was contaminated water from the Broad Street pump, and his persuasive presentation of the evidence to the Board of Governors of St. James Parish caused them to remove the handle from that pump, after which the contagion abated. (As the author explains, the outbreak was already declining at the time, and in all probability the water from the Broad Street pump was no longer contaminated then. However, due to subsequent events and discoveries made later, had the handle not been removed there would have likely been a second wave of the epidemic, with casualties comparable to the first.)

Afterward, Snow, with the assistance of initially-sceptical clergyman Henry Whitehead, whose intimate knowledge of the neighbourhood and its residents allowed compiling the data which not only confirmed Snow's hypothesis but identified what modern epidemiologists would call the “index case” and “vector of contagion”, revised his monograph to cover the 1854 outbreak, illustrated by a map of its casualties which has become a classic of on-the-ground epidemiology and the graphical presentation of data. Most brilliant was Snow's use (and apparent independent invention) of a Voronoi diagram to show the boundary, street by street, of the area closer, measured not in Euclidean distance but in walking time, to the Broad Street pump than to any other pump in the neighbourhood. (Oddly, the complete map with this crucial detail does not appear in the book: only a blow-up of the central section without the boundary. The full map is here; depending on your browser, you may have to click on the map image to display it at full resolution. The dotted and dashed line is the Voronoi cell enclosing the Broad Street pump.)
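
Snow's idea is easy to express computationally. The sketch below uses a toy street network I invented (not Snow's data) and assigns each corner to the pump nearest by walking time, using Dijkstra's shortest-path algorithm on a weighted graph; the boundary between the resulting sets of corners is precisely the kind of cell drawn on Snow's map.

    import heapq

    def walking_times(streets, source):
        """Dijkstra's algorithm: minutes on foot from `source` to every other node."""
        dist = {source: 0.0}
        queue = [(0.0, source)]
        while queue:
            d, node = heapq.heappop(queue)
            if d > dist.get(node, float("inf")):
                continue
            for neighbour, minutes in streets[node].items():
                nd = d + minutes
                if nd < dist.get(neighbour, float("inf")):
                    dist[neighbour] = nd
                    heapq.heappush(queue, (nd, neighbour))
        return dist

    # A made-up toy network: nodes are pumps and street corners, weights are walking minutes.
    streets = {
        "Broad St pump":  {"corner A": 2, "corner B": 3},
        "Rupert St pump": {"corner B": 4, "corner C": 2},
        "corner A": {"Broad St pump": 2, "corner B": 2, "corner D": 3},
        "corner B": {"Broad St pump": 3, "Rupert St pump": 4, "corner A": 2, "corner C": 2},
        "corner C": {"Rupert St pump": 2, "corner B": 2, "corner D": 4},
        "corner D": {"corner A": 3, "corner C": 4},
    }

    pumps = ("Broad St pump", "Rupert St pump")
    times = {pump: walking_times(streets, pump) for pump in pumps}
    for corner in ("corner A", "corner B", "corner C", "corner D"):
        nearest = min(pumps, key=lambda p: times[p].get(corner, float("inf")))
        print(corner, "-> nearest pump on foot is", nearest)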

In the following years, London embarked upon a massive program to build underground sewers to transport the waste of its millions of residents downstream to the tidal zone of the Thames and, later, directly to the sea. There would be one more cholera outbreak in London in 1866—in an area not yet connected to the new sewers and water treatment systems. Afterward, there has not been a single epidemic of cholera in London. Other cities in the developed world learned this lesson and built the infrastructure to provide their residents clean water. In the developing world, cholera continues to take its toll: in the 1990s an outbreak in South America infected more than a million people and killed almost 10,000. Fortunately, administration of rehydration therapy (with electrolytes) has drastically reduced the likelihood of death from a cholera infection. Still, you have to wonder why, in a world where billions of people lack access to clean water and third world mega-cities are drawing millions to live in conditions not unlike London in the 1850s, some believe that laptop computers are the top priority for children growing up there.

A paperback edition is now available.

December 2007 Permalink