- Taleb, Nassim Nicholas.
Skin in the Game.
New York: Random House, 2018.
ISBN 978-0-425-28462-9.
-
This book is volume four in the author's
Incerto series, following
Fooled by Randomness (February 2011),
The Black Swan (January 2009),
and Antifragile (April 2018).
In it, he continues to explore the topics of uncertainty, risk,
decision making under such circumstances, and how both
individuals and societies winnow out what works from what
doesn't in order to choose wisely among the myriad alternatives
available.
The title, “Skin in the Game”, is an idiom which
refers to an individual's sharing the risks and rewards of an
undertaking in which they are involved. This is often applied
to business and finance, but it is, as the author demonstrates,
a very general and powerful concept. An airline pilot has skin
in the game along with the passengers. If the plane crashes and
kills everybody on board, the pilot will die along with them.
This ensures that the pilot shares the passengers' desire for a
safe, uneventful trip and inspires confidence among them. A
government “expert” putting together a “food
pyramid” to be vigorously promoted among the citizenry and
enforced upon captive populations such as school children or
members of the armed forces, has no skin in the game. If his or
her recommendations create an epidemic of obesity, type 2
diabetes, and cardiovascular disease, the damage probably won't
become apparent until after the “expert” has retired and, in any
case, civil servants are not fired or demoted based upon the
consequences of their recommendations.
Ancestral human society was all about skin in the game. In a
small band of hunter/gatherers, everybody can see and is aware
of the actions of everybody else. Slackers who do not
contribute to the food supply are likely to be cut loose to fend
for themselves. When the hunt fails, nobody eats until the next
kill. If a conflict develops with a neighbouring band, those
who decide to fight instead of running away or surrendering are
in the front line of the battle and will be the first to suffer
in case of defeat.
Nowadays we are far more “advanced”. As the author
notes, “Bureaucracy is a construction by which a person is
conveniently separated from the consequences of his or her
actions.” As populations have exploded, layers and layers
of complexity have been erected, removing authority ever farther
from those under its power. We have built mechanisms which have
immunised a ruling class of decision makers from the
consequences of their decisions: they have little or no skin in
the game.
Less than a third of all Roman emperors died in their beds. Even
though they were at the pinnacle of the largest and most
complicated empire in the West, they regularly paid the ultimate
price for their errors either in battle or through palace
intrigue by those dissatisfied with their performance. Today the
geniuses responsible for the 2008 financial crisis, which
destroyed the savings of hundreds of millions of innocent people
and picked the pockets of blameless taxpayers to bail out the
institutions they wrecked, not only suffered no punishment of
any kind, but in many cases walked away with large bonuses or
golden parachute payments and today are listened to when they
pontificate on the current scene, rather than being laughed at
or scorned as they would be in a rational world. We have
developed institutions which shift the consequences of bad
decisions from those who make them to others, breaking the vital
feedback loop by which we converge upon solutions which, if not
perfect, at least work well enough to get the job done without
the repeated catastrophes that result from ivory tower theories
being implemented on a grand scale in the real world.
Learning and Evolution
Being creatures who have evolved large brains, we're inclined to
think that learning is something that individuals do, by
observing the world, drawing inferences, testing hypotheses, and
taking on knowledge accumulated by others. But the overwhelming
majority of creatures who have ever lived, and of those alive
today, do not have large brains—indeed, many do not have
brains at all. How have they learned to survive and
proliferate, filling every niche on the planet where
environmental conditions are compatible with biochemistry based
upon carbon atoms and water? How have they, over the billions
of years since life arose on Earth, inexorably increased in
complexity, most recently producing a species with a big brain
able to ponder such questions?
The answer is massive parallelism, exhaustive search, selection
for survivors, and skin in the game, or, putting it all
together, evolution. Every living creature has skin in the
ultimate game of whether it will produce offspring that inherit
its characteristics. Every individual is different, and the
process of reproduction introduces small variations in progeny.
Change the environment, and the characteristics of those best
adapted to reproduce in it will shift and, eventually, the
population will consist of organisms adapted to the new
circumstances. The critical thing to note is that while each
organism has skin in the game, many may, and indeed must, lose
the game and die before reproducing. The individual organism
does not learn, but the species does and, stepping back
another level, the ecosystem as a whole learns and adapts as
species appear, compete, die out, or succeed and proliferate.
This simple process has produced all of the complexity we
observe in the natural world, and it works because every
organism and species has skin in the game: its adaptation to its
environment has immediate consequences for its survival.
None of this is controversial or new. What the author has done
in this book is to apply this evolutionary epistemology
to domains far beyond its origins in biology—in fact, to
almost everything in the human experience—and demonstrate
that both success and wisdom are generated when this process is
allowed to work, but failure and folly result when it is
thwarted by institutions which take the skin out of the game.
How does this apply in present-day human society? Consider one
small example of a free market in action. The restaurant
business is notoriously risky. Restaurants come and go all the
time, and most innovations in the business fall flat on their
face and quickly disappear. And yet most cities have, at any
given time, a broad selection of restaurants with a wide variety
of menus, price points, ambiance, and service to appeal to
almost any taste. Each restaurant has skin in the game: those
which do not attract sufficient customers (or, having once been
successful, fail to adapt when customers' tastes change) go out
of business and are replaced by new entrants. And yet for all
the churning and risk to individual restaurants, the restaurant
“ecosystem” is remarkably stable, providing
customers options closely aligned with their current desires.
To a certain kind of “expert” endowed with a big
brain (often crammed into a pointy head), found in abundance
around élite universities and government agencies, all of
this seems messy, chaotic, and (the horror!)
inefficient. Consider the money lost when a restaurant
fails, the cooks and waiters who lose their jobs, having to find
a new restaurant to employ them, the vacant building earning
nothing for its owner until a new tenant is
found—certainly there must be a better way. Why, suppose
instead we design a standardised set of restaurants
based upon a careful study of public preferences, then roll out
this highly-optimised solution to the problem. They might be
called “public feeding centres”. And they would work
about as well as the name implies.
Survival and Extinction
Evolution ultimately works through extinction. Individuals who
are poorly adapted to their environment (or, in a free market,
companies which poorly serve their customers) fail to reproduce
(or, in the case of a company, to survive and expand). This leaves
a population better adapted to its environment. When the
environment changes, or a new innovation appears (for example,
electricity in an age dominated by steam power), a new sorting
out occurs which may see the disappearance of long-established
companies that failed to adapt to the new circumstances. It is
a tautology that the current population consists entirely of
survivors, but there is a deep truth within this observation
which is at the heart of evolution. As long as there is a
direct link between performance in the real world and
survival—skin in the game—evolution will work to
continually optimise and refine the population as circumstances
change.
This evolutionary process works just as powerfully in the realm
of ideas as in biology and commerce. Ideas have consequences,
and for the process of selection to function, those
consequences, good or ill, must be borne by those who promulgate
the idea. Consider inventions: an inventor who creates
something genuinely useful and brings it to market (recognising
that there are many possible missteps and opportunities for bad
luck or timing to disrupt this process) may reap great rewards
which, in turn, will fund elaboration of the original invention
and development of related innovations. The new invention may
displace existing technologies and cause them, and those who
produce them, to become obsolete and disappear (or be relegated
to a minor position in the market). Both the winner and loser
in this process have skin in the game, and the outcome of the
game is decided by the evaluation of the customers expressed in
the most tangible way possible: what they choose to buy.
Now consider an academic theorist who comes up with some
intellectual “innovation” such as
“Modern
Monetary Theory” (which basically says that a
government can print as much paper money as it wishes to pay for
what it wants without collecting taxes or issuing debt as long
as full employment has not been achieved). The theory and the
reputation of those who advocate it are evaluated by their
peers: other academics and theorists employed by institutions
such as national treasuries and central banks. Such a theory is
not launched into a market to fend for itself among competing
theories: it is “sold” to those in positions of
authority and imposed from the top down upon an economy,
regardless of the opinions of those participating in it. Now,
suppose the brilliant new idea is implemented and results in,
say, total collapse of the economy and civil
society? What price do those who promulgated the theory and
implemented it pay? Little or nothing, compared to the misery
of those who lost their savings, jobs, houses, and assets in the
calamity. Many of the academics will have tenure and suffer no
consequences whatsoever: they will refine the theory, or else
publish erudite analyses of how the implementation was flawed
and argue that the theory “has never been tried”.
Some senior officials may be replaced, but will doubtless land
on their feet and continue to pull down large salaries as
lobbyists, consultants, or pundits. The bureaucrats who
patiently implemented the disastrous policies are civil
servants: their jobs and pensions are as eternal as anything in
this mortal sphere. And, before long, another bright, new idea
will bubble forth from the groves of academe.
(If you think this hypothetical example is unrealistic, see the
career of one
Robert Rubin.
“Bob”, during his association with Citigroup between
1999 and 2009, received total compensation of US$126 million for
his “services” as a director, advisor, and temporary
chairman of the bank, during which time he advocated the
policies which eventually brought it to the brink of collapse in
2008 and vigorously fought attempts to regulate the financial
derivatives which eventually triggered the global catastrophe.
During his tenure at Citigroup, holders of its stock lost
70% of their investment, and eventually the bank was bailed out
by the federal government using money taken by coercive taxation
from cab drivers and hairdressers who had no culpability in
creating the problems. Rubin walked away with his
“winnings” and paid no price, financial, civil, or
criminal, for his actions. He is one of the many poster boys
and girls for the “no skin in the game club”. And
lest you think that, chastened, the academics and pointy-heads
in government would regain their grounding in reality, I have
just one phrase for you,
“trillion
dollar coin”, which “Nobel Prize” winner
Paul Krugman declared to be “the most important fiscal
policy debate of our lifetimes”.)
Intellectual Yet Idiot
A cornerstone of civilised society, dating from at least the
Code
of Hammurabi (c. 1754 B.C.), is
that those who create risks must bear those risks: an architect
whose building collapses and kills its owner is put to death.
This is the fundamental feedback loop which enables learning.
When it is broken, when those who create risks (academics,
government policy makers, managers of large corporations, etc.)
are able to transfer those risks to others (taxpayers, those
subject to laws and regulations, customers, or the public at
large), the system does not learn; evolution breaks down; and
folly runs rampant. This phenomenon is manifested most
obviously in the modern proliferation of the affliction the
author calls the “intellectual yet idiot” (IYI).
These are people who are evaluated by their peers (other IYIs),
not tested against the real world. They are the equivalent of a
list of movies chosen based upon the opinions of high-falutin'
snobbish critics as opposed to box office receipts. They strive
for the approval of others like themselves and, inevitably,
spiral into ever more abstract theories disconnected from ground
truth, ascending ever higher into the sky.
Many IYIs achieve distinction in one narrow field and then
assume that qualifies them to pronounce authoritatively on any
topic whatsoever. As was said by biographer Roy Harrod of John
Maynard Keynes,
He held forth on a great range of topics, on some of which
he was thoroughly expert, but on others of which he may have
derived his views from the few pages of a book at which he
happened to glance. The air of authority was the same in
both cases.
Still other IYIs have no authentic credentials whatsoever, but
derive their purported authority from the approbation of other
IYIs in completely bogus fields such as gender and ethnic
studies, critical anything studies, and nutrition science. As
the author notes, riding some of his favourite hobby horses,
Typically, the IYI get first-order logic right, but not
second-order (or higher) effects, making him totally
incompetent in complex domains.
The IYI has been wrong, historically, about Stalinism,
Maoism, Iraq, Libya, Syria, lobotomies, urban planning,
low-carbohydrate diets, gym machines, behaviorism,
trans-fats, Freudianism, portfolio theory, linear
regression, HFCS (High-Fructose Corn Syrup), Gaussianism,
Salafism, dynamic stochastic equilibrium modeling, housing
projects, marathon running, selfish genes,
election-forecasting models, Bernie Madoff (pre-blowup), and
p values. But he is still convinced his current
position is right.
Doubtless, IYIs have always been with us (at least since
societies developed to such a degree that they could afford some
fraction of the population who devoted themselves entirely to
words and ideas)—Nietzsche called them
“Bildungsphilisters”—but
since the middle of the twentieth century they have been
proliferating like pond scum, and now hold much of the high
ground in universities, the media, think tanks, and senior
positions in the administrative state. They believe their
models (almost always linear and first-order) accurately
describe the behaviour of complex dynamic systems, and that they
can “nudge” the less-intellectually-exalted and
credentialed masses into virtuous behaviour, as defined by
them. When the masses dare to push back, having a limited
tolerance for fatuous nonsense, or being scolded by those who
have been consistently wrong about, well, everything, and dare
vote for candidates and causes which make sense to them and seem
better-aligned with the reality they see on the ground, they are
accused of—gasp—populism, and must be
guided in the proper direction by their betters, their uncouth
speech silenced in favour of the cultured
“consensus” of the few.
One of the reasons we seem to have many more IYIs around than we
used to, and that they have more influence over our lives, is
related to scaling. As the author notes, “it is easier to
macrobull***t than microbull***t”. A grand theory which
purports to explain the behaviour of billions of people in a
global economy over a period of decades is impossible to test or
verify analytically or by simulation. An equally silly theory
that describes things within people's direct experience is
likely to be immediately rejected out of hand as the absurdity
it is. This is one reason decentralisation works so well: when
you push decision making down as close as possible to
individuals, their common sense asserts itself and immunises
them from the blandishments of IYIs.
The Lindy Effect
How can you sift the good and the enduring from the mass of
ephemeral fads and bad ideas that swirl around us every day?
The
Lindy effect
is a powerful tool. Lindy's delicatessen in New York City was a
favoured hangout for actors who observed that the amount of time
a show had been running on Broadway was the best predictor of
how long it would continue to run. A show that has run for three
months will probably last for at least three months more. A
show that has made it to the one year mark probably has another
year or more to go. In other words, the best test for whether
something will stand the test of time is whether it has
already withstood the test of time. This may, at
first, seem counterintuitive: a sixty-year-old person has a
shorter expected lifespan remaining than a twenty-year-old. But the
Lindy effect applies only to nonperishable things such as
“ideas, books, technologies, procedures, institutions, and
political systems”.
Thus, a book which has been in print continuously for a hundred
years is likely to be in print a hundred years from now, while
this season's hot best-seller may be forgotten a few years
hence. The latest political or economic theory filling up pages
in the academic journals and coming onto the radar of the IYIs
in the think tanks, media punditry, and (shudder)
government agencies, is likely to be forgotten and/or
discredited in a few years while those with a pedigree of
centuries or millennia continue to work for those more
interested in results than trendiness.
Religion is Lindy. If you disregard all of the spiritual
components to religion, long-established religions are powerful
mechanisms to transmit accumulated wisdom, gained through
trial-and-error experimentation and experience over many
generations, in a ready-to-use package for people today. One
disregards or scorns this distilled experience at one's own
great risk. Conversely, one should be as sceptical about
“innovation” in ancient religious traditions and
brand-new religions as one is of shiny new ideas in any other
field.
(A few more technical notes…. As I keep saying,
“Once
Pareto
gets into your head, you'll never get him out.” It's no
surprise to find that the Lindy effect is deeply related to the
power-law distribution of many things in human experience. It's
simply another way to say that the lifetime of nonperishable
goods is distributed according to a power law just like incomes,
sales of books, music, and movie tickets, use of health care
services, and commission of crimes. Further, the Lindy effect
is similar to J. Richard Gott's
Copernican statement
of the
Doomsday
argument, with the difference that Gott provides lower and
upper bounds on survival time for a given
confidence
level predicted solely from a random observation that
something has existed for a known time.)
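To make Gott's bounds concrete, here is a small sketch of my own, not
from the book. If something is observed at a uniformly random point in
its total lifespan, then with confidence c its remaining lifetime lies
between past × (1 − c)/(1 + c) and past × (1 + c)/(1 − c); at 95%
confidence those factors are 1/39 and 39.

    def gott_bounds(age, confidence=0.95):
        """Gott's Copernican estimate: given only that something has already
        existed for `age` (in any time unit), return a (low, high) interval
        for its remaining lifetime, assuming we observe it at a uniformly
        random point in its total lifespan."""
        c = confidence
        return age * (1 - c) / (1 + c), age * (1 + c) / (1 - c)

    print(gott_bounds(12))    # a show running 12 months: ≈ (0.31, 468) months to go
    print(gott_bounds(100))   # a book in print 100 years: ≈ (2.6, 3900) years to go

The Lindy heuristic is the qualitative lesson hiding inside those very
wide bounds: the longer something has already lasted, the longer it can
be expected to last.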
Uncertainty, Risk, and Decision Making
All of these observations inform dealing with risk and making
decisions based upon uncertain information. The key insight is
that in order to succeed, you must first survive. This
may seem so obvious as to not be worth stating, but many
investors, including those responsible for blow-ups which make
the headlines and take many others down with them, forget this
simple maxim. It is deceptively easy to craft an investment
strategy which will yield modest, reliable returns year in and
year out—until it doesn't. Such strategies tend to be
vulnerable to “tail risks”, in which an
infrequently-occurring event (such as 2008) can bring down the
whole house of cards and wipe out the investor and the fund.
Once you're wiped out, you're out of the game: you're like the
loser in a Russian roulette tournament who, after the gun goes
off, has no further worries about the probability of that
event. Once you accept that you will never have complete
information about a situation, you can begin to build a strategy
which will prevent your blowing up under any set of
circumstances, and may even be able to profit from volatility.
This is discussed in more detail in the author's earlier
Antifragile.
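To see how a strategy can look splendid on average yet ruin the typical
investor, here is a toy Monte Carlo sketch of my own (the numbers are
made up, not anything from the book): a portfolio gaining a steady 1% a
month which, with probability 1 in 200 each month, suffers a 90%
blow-up.

    import random, statistics

    def simulate(months=360, p_blowup=1/200, gain=0.01, loss=0.90, trials=20_000):
        """Toy 'steady returns until it doesn't' strategy: each month the
        portfolio gains `gain`, except that with probability `p_blowup`
        a tail event destroys `loss` of its value."""
        finals = []
        for _ in range(trials):
            value = 1.0
            for _ in range(months):
                if random.random() < p_blowup:
                    value *= (1 - loss)      # the rare blow-up
                else:
                    value *= (1 + gain)      # the usual modest, reliable gain
            finals.append(value)
        below_start = sum(v < 1.0 for v in finals) / trials
        return statistics.fmean(finals), statistics.median(finals), below_start

    mean, median, losers = simulate()
    print(f"mean multiple {mean:.1f}, median multiple {median:.2f}, "
          f"ending below starting capital: {losers:.0%}")

With these arbitrary parameters the ensemble average return over thirty
years is handsome, yet the median path ends below its starting capital:
the ensemble and the individual play very different games, which is
exactly why survival must come first.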
The Silver Rule
People and institutions who have skin in the game are likely to
act according to the Silver Rule: “Do not do to others
what you would not like them to do to you.” This rule,
combined with putting the skin of those “defence
intellectuals” sitting in air-conditioned offices into the
games they launch in far-off lands around the world, would do
much to save the lives and suffering of the young men and women
they send to do their bidding.
- Shlaes, Amity.
Coolidge.
New York: Harper Perennial, [2013] 2014.
ISBN 978-0-06-196759-7.
-
John Calvin Coolidge, Jr. was born in 1872 in Plymouth Notch,
Vermont. His family were among the branch of the Coolidge
clan who stayed in Vermont while others left its steep, rocky,
and often bleak land for opportunity in the Wild West of
Ohio and beyond when the Erie Canal opened up these new
territories to settlement. His father and namesake made
his living by cutting wood, tapping trees for sugar, and
small-scale farming on his modest plot of land. He
diversified his income by operating a general store in
town and selling insurance. There was a long tradition
of public service in the family. Young Coolidge's great-grandfather
was an officer in the American Revolution and his grandfather
was elected to the Vermont House of Representatives. His
father was justice of the peace and tax collector in Plymouth Notch,
and would later serve in the Vermont House of Representatives
and Senate.
Although many in the cities would consider their rural life,
far from the nearest railroad terminal, hard-scrabble, the
family was sufficiently prosperous to pay for young
Calvin (the name he went by from boyhood) to attend private
schools, boarding with families in the towns where they
were located and infrequently returning home. He followed
a general college preparatory curriculum and, after failing the
entrance examination the first time, was admitted on his
second attempt to Amherst College as a freshman in 1891.
A loner, and already with a reputation for being taciturn,
he joined none of the fraternities to which his classmates
belonged, nor did he participate in the athletics which
were a part of college life. He quickly perceived that Amherst
had a class system, where the scions of old money families
from Boston who had supported the college were elevated
above nobodies from the boonies like himself. He concentrated
on his studies, mastering Greek and Latin, and immersing
himself in the works of the great orators of those cultures.
As his college years passed, Coolidge became increasingly
interested in politics, joined the college
Republican Club, and worked on the 1892 re-election campaign of
Benjamin Harrison, whose Democrat opponent, Grover Cleveland,
was seeking to regain the presidency he had lost to Harrison
in 1888. Writing to his father after Harrison's defeat, his
analysis was that “the reason seems to be in the never
satisfied mind of the American and in the ever desire to shift
in hope of something better and in the vague idea of the working
and farming classes that somebody is getting all the money
while they get all the work.”
His confidence growing, Coolidge began to participate in formal
debates, finally, in his senior year, joined a fraternity,
and ran for and won the honour of being an orator at his
class's graduation. He worked hard on the speech, which
was a great success, keeping his audience engaged and
frequently laughing at his wit. While still quiet in one-on-one
settings, he enjoyed public speaking and connecting with
an audience.
After graduation, Coolidge decided to pursue a career in the
law and considered attending law school at Harvard or Columbia
University, but decided he could not afford the tuition, as
he was still being supported by his father and had no prospects
for earning sufficient money while studying the law. In that
era, most states did not require a law school education; an
aspiring lawyer could, instead, become an apprentice at an
established law firm and study on his own, a practice called
reading the law.
Coolidge became an apprentice at a firm in Northampton, Massachusetts
run by two Amherst graduates and, after two years, in 1897, passed
the Massachusetts bar examination and was admitted to the bar.
In 1898, he set out on his own and opened a small law office
in Northampton; he had embarked on the career of a country
lawyer.
While developing his law practice, Coolidge followed in the
footsteps of his father and grandfather and entered public
life as a Republican, winning election to the Northampton
City Council in 1898. In the following years, he held the
offices of City Solicitor and county clerk of courts. In
1903 he married Grace Anna Goodhue, a teacher at the
Clarke School for the Deaf in Northampton. The next
year, running for the local school board, he suffered the
only defeat of his political career, in part because his
opponents pointed out he had no children in the schools.
Coolidge said, “Might give me time.” (The
Coolidges went on to have two sons, John, born in 1906,
and Calvin Jr., in 1908.)
In 1906, Coolidge sought statewide office for the first time,
running for the Massachusetts House of Representatives and
narrowly defeating the Democrat incumbent. He was re-elected
the following year, but declined to run for a third term,
returning to Northampton where he ran for mayor, won, and
served two one-year terms. In 1912 he ran for the State Senate
seat of the retiring Republican incumbent and won. In the
presidential election of that year, when the Republican party
split between the traditional wing favouring William Howard
Taft and progressives backing Theodore Roosevelt, Coolidge,
although identified as a progressive, having supported women's
suffrage and the direct election of federal senators, among
other causes, stayed with the Taft Republicans and won
re-election. Coolidge sought a third term in 1914 and won,
being named President of the State Senate with substantial
influence on legislation in the body.
In 1915, Coolidge moved further up the ladder by running
for the office of Lieutenant Governor of Massachusetts,
balancing the Republican ticket led by a gubernatorial
candidate from the east of the state with his own
base of support in the rural west. In Massachusetts, the
Lieutenant Governor does not preside over the State Senate,
but rather fulfils an administrative role, chairing
executive committees. Coolidge presided over the finance
committee, which provided him experience in managing a
budget and dealing with competing demands from departments,
experience that was to prove useful later in his career. After being
re-elected to the office in 1916 and 1917 (statewide offices
in Massachusetts at the time had a term of only one year),
with the governor announcing his retirement, Coolidge was
unopposed for the Republican nomination for governor and
narrowly defeated the Democrat in the 1918 election.
Coolidge took office at a time of great unrest between
industry and labour. Prices in 1918 had doubled from their
1913 level; nothing of the kind had happened since the
paper money inflation during the Civil War and its aftermath.
Nobody seemed to know why: it was usually
attributed to the war, but nobody understood the cause and
effect. There doesn't seem to have been a single
mainstream voice who observed that the rapid rise in
prices (which was really a depreciation of the dollar) began
precisely at the moment the
Creature
from Jekyll Island was unleashed upon the U.S. economy
and banking system. What was obvious, however, was that in
most cases industrial wages had not kept pace with the rise in
the cost of living, and that large companies which had raised
their prices had not correspondingly increased what they paid
their workers. This gave a powerful boost to the growing union
movement. In early 1919 an ugly
general
strike in Seattle idled workers across the city, and the
United Mine Workers threatened a nationwide coal strike for
November 1919, just as the maximum demand for coal in winter
would arrive. In Boston, police officers voted to unionise and
affiliate with the American Federation of Labor, ignoring an
order from the Police Commissioner forbidding officers to
join a union. On September 9th, a majority of policemen defied
the order and walked off the job.
Those who question the need for a police presence on the street
in big cities should consider the Boston police strike as a cautionary
tale, at least as things were in the city of Boston in the year
1919. As the Sun went down, the city erupted in chaos, mayhem,
looting, and violence. A streetcar conductor was shot for no
apparent reason. There were reports of rapes, murders, and serious
injuries. The next day, more than a thousand residents applied
for gun permits. Downtown stores were boarding up their
display windows and hiring private security forces. Telephone
operators and employees at the electric power plant threatened
to walk out in sympathy with the police. From Montana, where
he was campaigning in favour of ratification of the League
of Nations treaty, President Woodrow Wilson issued a mealy-mouthed
statement saying, “There is no use in talking about
political democracy unless you have also industrial
democracy”.
Governor Coolidge acted swiftly and decisively. He called up the
Guard and deployed them throughout the city, fired all of the
striking policemen, and issued a statement saying “The
action of the police in leaving their posts of duty is not a
strike. It is a desertion. … There is nothing to
arbitrate, nothing to compromise. In my personal opinion there
are no conditions under which the men can return to the force.”
He directed the police commissioner to hire a new force to
replace the fired men. He publicly rebuked American Federation of
Labor chief Samuel Gompers in a telegram released to the press
which concluded, “There is no right to strike against the
public safety by anybody, anywhere, any time.”
When the dust settled, the union was broken, peace was restored
to the streets of Boston, and Coolidge had emerged onto the
national stage as a decisive leader and champion of what he
called the “reign of law.” Later in 1919, he was
re-elected governor with seven times the margin of his first
election. He began to be spoken of as a potential candidate
for the Republican presidential nomination in 1920.
Coolidge was nominated at the 1920 Republican convention, but
never came in above sixth in the balloting, in the middle of
the pack of regional and favourite son candidates. On the
tenth ballot, Warren G. Harding of Ohio was chosen, and
party bosses announced their choice for Vice President, a
senator from Wisconsin. But when the time came for delegates
to vote, a Coolidge wave among rank and file tired of the
bosses ordering them around gave him the nod. Coolidge did
not attend the convention in Chicago; he got the news of
his nomination by telephone. After he hung up, Grace asked
him what it was all about. He said, “Nominated for
vice president.” She responded, “You don't
mean it.” “Indeed I do”, he answered.
“You are not going to accept it, are you?”
“I suppose I shall have to.”
Harding ran on a platform of “normalcy” after the
turbulence of the war and Wilson's helter-skelter progressive
agenda. He expressed his philosophy in a speech several months
earlier,
America's present need is not heroics, but healing; not
nostrums, but normalcy; not revolution, but restoration; not
agitation, but adjustment; not surgery, but serenity; not the
dramatic, but the dispassionate; not experiment, but equipoise;
not submergence in internationality, but sustainment in
triumphant nationality. It is one thing to battle successfully
against world domination by military autocracy, because the
infinite God never intended such a program, but it is
quite another to revise human nature and suspend the
fundamental laws of life and all of life's acquirements.
The election was a blow-out. Harding and Coolidge won the
largest electoral college majority (404 to 127) since James
Monroe's unopposed re-election in 1820, and more than 60% of the
popular vote. Harding carried every state except for the Old South,
and was the first Republican to win Tennessee since
Reconstruction. Republicans picked up 63 seats in the House,
for a majority of 303 to 131, and 10 seats in the Senate, with
59 to 37. Whatever Harding's priorities, he was likely to be
able to enact them.
The top priority in Harding's quest for normalcy was federal
finances. The Wilson administration and the Great War had
expanded the federal government into
terra incognita.
Between 1789 and 1913, when Wilson took office, the U.S. had
accumulated a total of US$2.9 billion in public debt. When
Harding was inaugurated in 1921, the debt stood at US$24
billion, more than a factor of eight greater. In 1913, total
federal spending was US$715 million; by 1920 it had ballooned to
US$6358 million, almost nine times more. The top marginal
income tax rate, 7% before the war, was 70% when Harding took
the oath of office, and the cost of living had approximately doubled
since 1913, which shouldn't have been a surprise (although it
was largely unappreciated at the time), because a complaisant
Federal Reserve had doubled the money supply from US$22.09
billion in 1913 to US$48.73 billion in 1920.
At the time, federal spending worked much as it had in the
early days of the Republic: individual agencies presented
their spending requests to Congress, where they battled against
other demands on the federal purse, with congressional
advocates of particular agencies doing deals to get what
they wanted. There was no overall budget process worthy of
the name (or such as existed in private companies a fraction the
size of the federal government), and the President, as chief
executive, could only sign or veto individual spending bills,
not an overall budget for the government. Harding had campaigned
on introducing a formal budget process and made this his
top priority after taking office. He called an extraordinary
session of Congress and, making the most of the Republican
majorities in the House and Senate, enacted a bill which created
a Budget Bureau in the executive branch, empowered the president
to approve a comprehensive budget for all federal expenditures,
and even allowed the president to reduce agency spending of
already appropriated funds. The budget would be a central
focus for the next eight years.
Harding also undertook to dispose of surplus federal assets
accumulated during the war, including naval petroleum reserves.
This, combined with Harding's penchant for cronyism, led to a
number of scandals which tainted the reputation of his
administration. On August 2nd, 1923, while on a speaking tour of the
country promoting U.S. membership in the World Court, he
suffered a heart attack and died in San Francisco. Coolidge,
who was visiting his family in Vermont, where there was no
telephone service at night, was awakened to learn that he
had succeeded to the presidency. He took the oath of office
by kerosene light in his parents' living room, administered
by his father, a Vermont notary public. As he left Vermont
for Washington, he said, “I believe I can swing it.”
As Coolidge was in complete agreement with Harding's policies,
if not his style and choice of associates, he interpreted
“normalcy” as continuing on the course set by
his predecessor. He retained Harding's entire cabinet
(although he had his doubts about some of its more dodgy
members), and began to work closely with his budget
director,
Herbert Lord,
meeting with him weekly before the full cabinet meeting.
Their goal was to continue to cut federal spending,
generate surpluses to pay down the public debt, and
eventually cut taxes to boost the economy and leave more money
in the pockets of those who earned it. He had a powerful ally
in these goals in Treasury secretary
Andrew Mellon,
who went further and advocated his theory of “scientific
taxation”. He argued that the existing high tax rates
not only hampered economic growth but actually reduced the
amount of revenue collected by the government. Just as a
railroad's profits would suffer from a drop in traffic if it
set its freight rates too high, a high tax rate would deter
individuals and companies from making more taxable income.
What was crucial was the “top marginal tax rate”: the
tax paid on the next additional dollar earned. With the tax
rate on high earners at the postwar level of 70%, individuals
got to keep only thirty cents of each additional dollar they
earned; many would not bother putting in the effort.
Half a century later, Mellon would have been called a
“supply sider”, and his ideas were just as
valid as when they were applied in the Reagan administration
in the 1980s. Coolidge wasn't sure he agreed with all of
Mellon's theory, but he was 100% in favour of cutting the
budget, paying down the debt, and reducing the tax burden
on individuals and business, so he was willing to give it
a try. It worked. The last budget submitted by the Coolidge
administration (fiscal year 1929) was US$3.127 billion, less
than half of fiscal year 1920's expenditures. The public
debt had been paid down from US$24 billion to US$17.6
billion, and the top marginal tax rate had been more than
halved from 70% to 31%.
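Mellon's argument is easy to make concrete with a toy model (mine, not
his and not the author's): suppose reported taxable income shrinks as
the rate rises, say income = base × (1 − rate)^elasticity for some
assumed elasticity. Revenue, which is rate × income, then need not rise
with the rate.

    def revenue(rate, base=100.0, elasticity=2.0):
        """Toy illustration of Mellon-style 'scientific taxation' (the
        elasticity is an arbitrary assumption, not historical data):
        taxable income shrinks as the rate rises, so
        revenue = rate * base * (1 - rate)**elasticity."""
        return rate * base * (1 - rate) ** elasticity

    for r in (0.24, 0.40, 0.70):
        print(f"top rate {r:.0%}: revenue {revenue(r):5.1f}")
    # With this assumed responsiveness the 70% rate collects far less
    # than either lower rate; the deterrent effect dominates.

Whether a real rate cut raises or lowers revenue depends entirely on
that elasticity, an empirical question; the sketch only shows why the
answer is not automatic.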
Achieving these goals required constant vigilance and an
unceasing struggle with the congress, where politicians of
both parties regarded any budget surplus or increase in
revenue generated by lower tax rates and a booming
economy as an invitation to spend, spend, spend. The Army
and Navy argued for major expenditures to defend the
nation from the emerging threat posed by aviation. Coolidge's
head of defense aviation observed that the Great Lakes had
been undefended for a century, yet Canada had not so far
invaded and occupied the Midwest and that, “to create a
defense system based upon a hypothetical attack from
Canada, Mexico, or another of our near neighbors would
be wholly unreasonable.” When devastating floods
struck the states along the Mississippi, Coolidge was
steadfast in insisting that relief and recovery were the
responsibility of the states. The New York Times
approved, “Fortunately, there are still some things
that can be done without the wisdom of Congress and the
all-fathering Federal Government.”
When Coolidge succeeded to the presidency, Republicans were
unsure whether he would run in 1924, or would obtain the
nomination if he sought it. By the time of the convention in
June of that year, Coolidge's popularity was such that he was
nominated on the first ballot. The 1924 election was another
blow-out, with Coolidge winning 35 states and 54% of the
popular vote. His Democrat opponent, John W. Davis, carried
just the 12 states of the “solid South” and won
28.8% of the popular vote, the lowest popular vote
percentage of any Democrat candidate to this day. Robert
La Follette of Wisconsin, who had challenged Coolidge for
the Republican nomination and lost, ran as a Progressive,
advocating higher taxes on the wealthy and nationalisation
of the railroads, and won 16.6% of the popular vote and
carried the state of Wisconsin and its 13 electoral votes.
Tragedy struck the Coolidge family in the White House in 1924
when his second son, Calvin Jr., developed a blister while
playing tennis on the White House courts. The blister
became infected with Staphylococcus aureus, a
bacterium which is readily treated today with penicillin
and other antibiotics, but in 1924 had no treatment
other than hoping the patient's immune system would throw
off the infection. The infection spread to the blood and
sixteen-year-old Calvin Jr. died on July 7th, 1924. The
president was devastated by the loss of his son and never
forgave himself for bringing his son to Washington where
the injury occurred.
In his second term, Coolidge continued the policies of
his first, opposing government spending programs, paying down
the debt through budget surpluses, and cutting taxes. When
the mayor of Johannesburg, South Africa, presented the
president with two lion cubs, he named them “Tax
Reduction” and “Budget Bureau” before
donating them to the National Zoo. In 1927, on vacation
in South Dakota, the president issued a characteristically
brief statement, “I do not choose to run for
President in nineteen twenty eight.” Washington
pundits spilled barrels of ink parsing Coolidge's twelve
words, but they meant exactly what they said: he had had
enough of Washington and the endless struggle against big
spenders in Congress, and (although re-election was
considered almost certain given his landslide the last
time, popularity, and booming economy) considered ten
years in office (which would have been longer than any
previous president had served) too long for any individual to
serve. Also, he was becoming increasingly concerned
about speculation in the stock market, which had more
than doubled during his administration and would
continue to climb in its remaining months. He was
opposed to government intervention in the markets and,
in an era before the Securities and Exchange Commission,
had few tools with which to intervene even had he wished to. Edmund Starling, his
Secret Service bodyguard and frequent companion on walks,
said, “He saw economic disaster ahead”, and
as the 1928 election approached and it appeared that
Commerce Secretary Herbert Hoover would be the Republican
nominee, Coolidge said, “Well, they're going to elect that
superman Hoover, and he's going to have some trouble. He's
going to have to spend money. But he won't spend enough.
Then the Democrats will come in and they'll spend money
like water. But they don't know anything about money.”
Coolidge may have spoken few words, but when he did he was
worth listening to.
Indeed, Hoover was elected in 1928 in another Republican
landslide (40 to 8 states, 444 to 87 electoral votes, and
58.2% of the popular vote), and things played out exactly
as Coolidge had foreseen. The 1929 crash triggered a
series of moves by Hoover which undid most of the patient
economies of Harding and Coolidge, and by the time Hoover was
defeated by Franklin D. Roosevelt in 1932, he had added
33% to the national debt and raised the top marginal
personal income tax rate to 63% and corporate taxes by 15%.
Coolidge, in retirement, said little about Hoover's policies
and did his duty to the party, campaigning for him in the
foredoomed re-election campaign in 1932. After the
election, he remarked to an editor of the New York
Evening Mail, “I have been out of touch so
long with political activities I feel that I no longer
fit in with these times.” On January 5, 1933,
Coolidge, while shaving, suffered a sudden heart attack
and was found dead in his dressing room by his wife
Grace.
Calvin Coolidge was arguably the last U.S. president to
act in office as envisioned by the Constitution. He advanced
no ambitious legislative agenda, leaving lawmaking to Congress.
He saw his job as similar to an executive in a business,
seeking economies and efficiency, eliminating waste and
duplication, and restraining the ambition of subordinates
who sought to broaden the mission of their departments
beyond what had been authorised by Congress and the
Constitution. He set difficult but limited goals for
his administration and achieved them all, and he
was popular while in office and respected after leaving it.
But how quickly it was all undone is a lesson in how
fickle the electorate can be, and how tempting ill-conceived
ideas are in a time of economic crisis.
This is a superb history of Coolidge and his time, full of
lessons for our age which has veered so far from the
constitutional framework he so respected.
- Carr, Jack.
True Believer.
New York: Atria Books, 2019.
ISBN 978-1-5011-8084-2.
-
Jack Carr, a former U.S. Navy SEAL, burst into the world of
thriller authors with 2018's stunning success,
The Terminal List (September 2018).
In it, he introduced James Reece, a SEAL whose team was
destroyed by a conspiracy reaching into the highest levels
of the U.S. government and who, afflicted with a brain tumour
caused by a drug tested on him and his team without their knowledge
or consent, which he expected to kill him, set out for
revenge upon those responsible. As that novel concluded,
Reece, a hunted man, took to the sea in a sailboat, fully
expecting to die before he reached whatever destination he
might choose.
This sequel begins right where the last book ended. James
Reece is aboard the forty-eight foot sailboat
Bitter Harvest braving the rough November
seas of the North Atlantic and musing that as a
Lieutenant Commander in the U.S. Navy he knew very little
about sailing a boat in the open ocean. With supplies
adequate to go almost anywhere he desires, and not
necessarily expecting to live until his next landfall
anyway, he decides on an ambitious voyage to see an old
friend far from the reach of the U.S. government.
While Reece is at sea, a series of brazen and bloody terrorist
attacks in Europe against civilian and military targets sends
analysts on both sides of the Atlantic digging through their
resources to find common threads which might point back to
whoever is responsible, as their populace becomes increasingly
afraid of congregating in public.
Reece eventually arrives at a hunting concession in Mozambique,
in southeast Africa, and signs on as an apprentice professional
hunter, helping out in tracking and chasing off poachers who
plague the land during the off-season. This suits him just fine:
he's about as far off the grid as one can get in this over-connected
world, among escapees from Rhodesia who understand what it's like
to lose their country, surrounded by magnificent scenery and
wildlife, and actively engaged in putting his skills to work
defending them from human predators. He concludes he could get
used to this life, for however long he has to live.
This idyll comes to an end when he is tracked down by another
former SEAL, now in the employ of the CIA, who tells Reece that
a man he trained in Iraq is suspected of being involved in the
terrorist attacks and that if Reece will join in an effort to
track him down and get him to flip on his terrorist masters,
the charges pending against Reece will be dropped and he can
stop running and forever looking over his shoulder. After
what the U.S. government has done to him, his SEAL team, and
his family, Reece's inclination is to tell them to pound sand.
Then, as always, the eagle flashes its talons and Reece is told
that if he fails to co-operate, the Imperium will go after
all of those who helped him avenge the wrongs it inflicted upon
him and escape its grasp. With that bit of Soviet-style recruiting
out of the way, Reece is off to a CIA black site in the
REDACTED region of REDACTED to train with REDACTED for his
upcoming mission. (In this book, like the last, passages
which are said to have been required to have been struck
during review of the manuscript by the Department of Defense
Office of Prepublication and Security Review are blacked out
in the text. This imparted a kind of frisson
and authenticity the first time out, but now it's getting
somewhat tedious—just change the details, Jack, and
get on with it!)
As Reece prepares for his mission, events lead him to believe
he is not just confronting an external terrorist threat but,
once again, forces within the U.S. government willing to
kill indiscriminately to get their way. Finally, the time
comes to approach his former trainee and get to the bottom
of what is going on. From this point on, the story is what
you'd expect of a thriller, with tradecraft, intrigue,
betrayal, and discovery of a dire threat with extreme
measures taken under an imminent deadline to avoid catastrophe.
The pacing of the story is…odd. The entire first third
of the book is largely occupied by Reece sailing his boat and
working at the game reserve. Now, single-handedly sailing
a sailboat almost halfway around the globe is challenging and
an adventure, to be sure, and a look inside the world of an
African hunting reserve is intriguing, but these are not what thriller
readers pay for, nor do they particularly develop the character
of James Reece, employ his unique skills, or reveal things about
him we don't already know. We're halfway through the book
before Reece achieves his first goal of making contact with
his former trainee, and it's only there that the real mission
gets underway. And as the story ends, although a number of
villains have been dispatched in satisfying ways, two of those
involved in the terrorist plot (but not its masterminds) remain
at large, for Reece to hunt down, presumably in the next book,
in a year or so. Why not finish it here, then do something
completely different next time?
I hope international agents don't take their tax advice from
this novel. The CIA agent who “recruits” Reece
tells him “It's a contracted position. You won't pay
taxes on most of it as long as you're working overseas.”
Wrong! U.S. citizens (which Reece, more fool him, remains)
owe U.S. taxes on all of their worldwide income, regardless
of the source. There is an exclusion for salary income from
employment overseas, but this would not apply for payments by
the CIA to an independent contractor. Later in the book, Reece
receives a large cash award from a foreign government for dispatching
a terrorist, which he donates to support the family of a
comrade killed in the operation. He would owe around 50% of the
award as federal and California state income taxes (since his
last U.S. domicile was the once-golden state) off the top, and
unless he was extraordinarily careful (which there is no evidence
he was), he'd get whacked again with gift tax as punishment for
his charity. Watch out, Reece, if you think having the FBI,
CIA, and Naval Criminal Investigative Service on your tail is
bad, be glad you haven't yet crossed the IRS or the California
Franchise Tax Board!
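For the curious, the “around 50%” figure is roughly what the top
combined marginal rates in force when this review appeared would imply.
The sketch below is mine, not the novel's: the rates (37% federal,
13.3% California) are assumptions, the award amount is a placeholder,
and brackets, deductions, and the gift-tax question are ignored
entirely.

    # Back-of-the-envelope only: assumed top marginal rates at the time
    # of this review, applied to a hypothetical award amount.
    FEDERAL_TOP = 0.37
    CALIFORNIA_TOP = 0.133

    def after_tax(award):
        """Return (tax owed, amount kept) on a cash award taxed at the
        combined top marginal rates."""
        owed = award * (FEDERAL_TOP + CALIFORNIA_TOP)
        return owed, award - owed

    owed, kept = after_tax(1_000_000)     # hypothetical US$1 million award
    print(f"owed ≈ US${owed:,.0f}, kept ≈ US${kept:,.0f}")   # just over half gone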
The Kindle edition does not have the attention to detail you'd
expect from a Big Five New York publisher (Simon and Schuster)
in a Kindle book selling for US$13. In five places,
HTML character entity codes like “&8201;” (the
code for the thin space used between adjacent single and double
quote marks) appear in the text. What this says to me is that
nobody at this professional publishing house did a page-by-page
proof of the Kindle edition before putting it on sale. I don't
know of a single independently-published science fiction author
selling works for a fraction of this price who would fail to do
this.
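Catching this sort of artifact does not even require a page-by-page
read; a few lines of script scanning the extracted text for leftover
character references would have flagged every instance. A rough sketch,
assuming the book's text can be pulled out as UTF-8 files:

    import re, sys

    # Flag leftover HTML character references, whether well formed
    # ("&#8201;", "&nbsp;") or mangled ("&8201;"), in extracted e-book text.
    ENTITY = re.compile(r"&#?[A-Za-z0-9]{2,8};")

    def scan(path):
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, 1):
                for hit in ENTITY.findall(line):
                    print(f"{path}:{lineno}: suspicious entity {hit!r}")

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            scan(path)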
This is a perfectly competent thriller, but to this reader
it does not come up to the high standard set by the debut
novel. You should not read this book without reading
The Terminal List first; if you don't, you'll
miss most of the story of what made James Reece who he is
here.
- Griffin, G. Edward.
The Creature from Jekyll Island.
Westlake Village, CA: American Media, [1994, 1995, 1998, 2002] 2010.
ISBN 978-0-912986-45-6.
-
Almost every time I review a book about or discuss the U.S.
Federal Reserve System in a conversation or Internet post,
somebody recommends this book. I'd never gotten around to
reading it until recently, when a couple more mentions of it
pushed me over the edge. And what an edge that turned out
to be. I cannot recommend this book to anybody; there are
far more coherent, focussed, and persuasive analyses of
the Federal Reserve in print, for example Ron Paul's excellent
book End the Fed (October 2009).
The present book goes well beyond a discussion of the Federal
Reserve and rambles over millennia of history in a chaotic
manner prone to induce temporal vertigo in the reader, discussing
the history of money, banking, political manipulation of
currency, inflation, fractional reserve banking, fiat
money, central banking, cartels, war profiteering,
monetary panics and bailouts, nonperforming loans
to “developing” nations, the Rothschilds and
Rockefellers, booms and busts, and more.
The author is inordinately fond of conspiracy theories. As
we pursue our random walk through history and around the
world, we encounter:
- The sinking of the Lusitania
- The assassination of Abraham Lincoln
- The Order of the Knights of the Golden Circle,
the Masons, and the Ku Klux Klan
- The Bavarian Illuminati
- Russian Navy intervention in the American Civil War
- Cecil Rhodes and the Round Table Groups
- The Council on Foreign Relations
- The Fabian Society
- The assassination of John F. Kennedy
- Theodore Roosevelt's “Bull Moose” run
for the U.S. presidency in 1912
- The Report from Iron Mountain
- The attempted assassination of Andrew Jackson in 1835
- The Bolshevik Revolution in Russia
I've jumped around in history to give a sense of the
chaotic, achronological narrative here. “What does
this have to do with the Federal Reserve?”, you
might ask. Well, not very much, except as part of a
worldview in which almost everything is explained by the
machinations of bankers assisted by the crooked politicians
they manipulate.
Now, I agree with the author, on those occasions when he
actually gets around to discussing the Federal Reserve,
that it was fraudulently sold to Congress and the
U.S. population and has acted, from the very start, as
a self-serving cartel of big New York banks enriching
themselves at the expense of anybody who holds assets
denominated in the paper currency they have been inflating
away ever since 1913. But you don't need to invoke
conspiracies stretching across the centuries and around
the globe to explain this. The Federal Reserve is
(despite how it was deceptively structured and
promoted) a central bank, just like the Bank of England
and the central banks of other European countries
upon which it was modelled, and creating funny money
out of thin air and looting the population by the
hidden tax of inflation is what central banks do,
always have done, and always will, as long as they are
permitted to exist. Twice in the history of the U.S.
prior to the establishment of the Federal Reserve,
central banks were created, the
first in 1791
by Alexander Hamilton, and the
second
in 1816. Each time, after the abuses of such an
institution became apparent, the bank was abolished,
the first in 1811, and the second in 1836. Perhaps,
after the inevitable crack-up which always results from
towering debt and depreciating funny money, the
Federal Reserve will follow the first two central banks
into oblivion, but so deeply is it embedded in the
status quo that it is difficult to see how that might happen
today.
In addition to the rambling narrative, the production values
of the book are shoddy. For a book which has gone through
five editions and 33 printings, nobody appears to have
spent the time giving the text even the most cursory of
proofreading. Without examining it with the critical eye
I apply when proofing my own work or that of others, I noted
137 errors of spelling, punctuation, and formatting in the text.
Paragraph breaks are inserted seemingly at random, right in
the middle of sentences, and words are run
together. Words which are misspelled include
“from”, “great”, “fourth”,
and “is”. This is not a freebie or dollar
special, but a paperback which sells for US$20 at
Amazon, or US$18 for the Kindle edition. And as I
always note, if the author and publisher cannot be
bothered to get simple things like these correct, how
likely is it that facts and arguments in the text can
be trusted?
Don't waste your money or your time. Ron Paul's End the
Fed is much better, only a third the length, and
concentrates on the subject without all of the whack-a-doodle
digressions. For a broader perspective on the history of money,
banking, and political manipulation of currency, see Murray
Rothbard's classic What Has Government Done
to Our Money? (July 2019).
- Butler, Smedley D.
War Is a Racket.
San Diego, CA: Dauphin Publications, [1935] 2018.
ISBN 978-1-939438-58-4.
-
Smedley Butler knew a thing or two about war. In 1898, a little
over a month before his seventeenth birthday, he lied about
his age and joined the U.S. Marine Corps, which
directly commissioned him a second lieutenant. After
completing training, he was sent to Cuba, arriving shortly
after the end of the Spanish-American War. Upon returning
home, he was promoted to first lieutenant and sent to the
Philippines as part of the American garrison. There,
he led Marines in combat against Filipino rebels. In 1900
he was deployed to China during the Boxer Rebellion and
was wounded in the Gaselee Expedition, being promoted to
captain for his bravery.
He then served in the “Banana Wars” in Central
America and the Caribbean. In 1914, during a conflict in
Mexico, he carried out an undercover mission in support of
a planned U.S. intervention. For his command in the
battle of Veracruz, he was awarded the Medal of Honor. Next,
he was sent to Haiti, where he commanded Marines and Navy
troops in an attack on Fort Rivière in November
1915. For this action, he won a second Medal of Honor.
To this day, he is one of only nineteen people to have been
awarded the Medal of Honor twice.
In World War I he did not receive a combat command, but for
his work in commanding the debarkation camp in France for
American troops, he was awarded both the Army and Navy
Distinguished Service Medals. Returning to the U.S. after
the armistice, he became commanding general of the Marine
training base at Quantico, Virginia. Between 1927 and 1929
he commanded the Marine Expeditionary Force in China, and
returning to Quantico in 1929, he was promoted to Major General,
then the highest rank available in the Marine Corps (which
was subordinate to the Navy), becoming the youngest person
in the Corps to attain that rank. He retired from the
Marine Corps in 1931.
In this slim pamphlet (just 21 pages in the Kindle edition
I read), Butler demolishes the argument that the U.S. military
actions in which he took part in his 33 years as a Marine had
anything whatsoever to do with the defence of the United States.
Instead, he saw lives and fortune squandered on foreign adventures
largely in the service of U.S. business interests, with
those funding and supplying the military banking large
profits from the operation. With the introduction of
conscription in World War I, the cynical exploitation of
young men reached a zenith with draftees paid US$30
a month, with half taken out to support dependants,
and another bite for mandatory insurance, leaving less
than US$9 per month for putting their lives on the line.
And then, in a final insult, there was powerful coercion
to “invest” this paltry sum in “Liberty
Bonds” which, after the war, were repaid at well below
the purchase price and/or in dollars which had lost
half their purchasing power.
Want to put an end to endless, futile, and tragic wars?
Forget disarmament conferences and idealistic initiatives,
Butler says,
The only way to smash this racket is to conscript capital
and industry and labor before the nations [sic] manhood
can be conscripted. One month before the Government can
conscript the young men of the nation—it must conscript
capital and industry. Let the officers and the directors
and the high-powered executives of our armament factories
and our shipbuilders and our airplane builders and the
manufacturers of all the other things that provide profit in
war time as well as the bankers and the speculators, be
conscripted—to get $30 a month, the same wage as the
lads in the trenches get.
Let the workers in these plants get the same wages—all
the workers, all presidents, all directors, all managers,
all bankers—yes, and all generals and all admirals and all
officers and all politicians and all government office
holders—everyone in the nation be restricted to a
total monthly income not to exceed that paid to the
soldier in the trenches!
Let all these kings and tycoons and masters of business and
all those workers in industry and all our senators and
governors and majors [I think “mayors” was
intended —JW] pay half their monthly $30 wage to their
families and pay war risk insurance and buy Liberty Bonds.
Why shouldn't they?
Butler goes on to recommend that any declaration of war require
approval by a national plebiscite in which voting would be
restricted to those subject to conscription in a military
conflict. (Writing in 1935, he never foresaw that young
men and women would be sent into combat without so much as a
declaration of war being voted by Congress.) Further,
he would restrict all use of military force to genuine
defence of the nation, in particular, limiting the Navy to
operating no more than 200 miles (320 km) from the coastline.
This is an impassioned plea against the folly of foreign
wars by a man whose career was as a warrior. One can
argue that there is a legitimate interest in, say, assuring
freedom of navigation in international waters, but looking
back on the results of U.S. foreign wars in the 21st century,
it is difficult to argue they can be justified any more than
the “Banana Wars” Butler fought in his time.