
Sunday, September 30, 2012

Floating Point Benchmark: COBOL Added

I have posted an update to my trigonometry-intense floating point benchmark which adds COBOL to the list of languages in which the benchmark is implemented. A new release of the benchmark collection including COBOL is now available for downloading.

The COBOL benchmark was developed with the OpenCOBOL 1.1.0 compiler under Ubuntu Linux 11.04. Unfortunately, this open source implementation of COBOL does not deliver the floating point accuracy the benchmark requires—results typically differ in the sixth decimal place or beyond. Since I only quote results for benchmarks which produce identical results to the 11th decimal place, timings for OpenCOBOL do not appear in the following table.

I moved the benchmark code to a Windows 7 machine on which I had installed an evaluation version of Micro Focus Visual COBOL 2010 R4 Version 1.3.00046. This compiler created a program which produced the expected results from the computation. The speed comparison was made against the C benchmark compiled with Microsoft Visual C++ 2010 Express version 10.0.30319.1 RTMRel and run on the same machine.

The relative performance of the various language implementations (with C taken as 1) is as follows. All implementations of the benchmark listed below produced identical results to the last (11th) decimal place.

Language             Relative Time   Details
C                        1           GCC 3.2.3 -O3, Linux
Visual Basic .NET        0.866       All optimisations, Windows XP
FORTRAN                  1.008       GNU Fortran (g77) 3.2.3 -O3, Linux
Pascal                   1.027       Free Pascal 2.2.0 -O3, Linux
                         1.077       GNU Pascal 2.1 (GCC 2.95.2) -O3, Linux
Java                     1.121       Sun JDK 1.5.0_04-b05, Linux
Visual Basic 6           1.132       All optimisations, Windows XP
Haskell                  1.223       GHC 7.4.1 -O2 -funbox-strict-fields, Linux
Ada                      1.401       GNAT/GCC 3.4.4 -O3, Linux
Lisp                     7.41        GNU Common Lisp 2.6.7, Compiled, Linux
                        19.8         GNU Common Lisp 2.6.7, Interpreted
Smalltalk                7.59        GNU Smalltalk 2.3.5, Linux
COBOL                   12.5         Micro Focus Visual COBOL 2010, Windows 7
                        46.3         Fixed decimal instead of COMPUTATIONAL-2
Python                  17.6         Python 2.3.3 -OO, Linux
Perl                    23.6         Perl v5.8.0, Linux
Ruby                    26.1         Ruby 1.8.3, Linux
JavaScript              27.6         Opera 8.0, Linux
                        39.1         Internet Explorer 6.0.2900, Windows XP
                        46.9         Mozilla Firefox 1.0.6, Linux
QBasic                 148.3         MS-DOS QBasic 1.1, Windows XP Console

Two versions of the benchmark program were run. Both contain identical logic in the PROCEDURE DIVISION, but declare decimal variables in different ways. The first uses COMPUTATIONAL-2, which most modern COBOL implementations map into native 64 bit IEEE floating point; this program, compiled with Visual COBOL, runs 12.5 times slower than the C benchmark. The second declares these variables as fixed point decimal computational quantities, for example:

    30 INDEX-OF-REFRACTION USAGE IS COMPUTATIONAL
                PICTURE IS S99V9(16) VALUE IS 1.6164.

As expected, this causes a large performance hit: this edition runs 46.3 times slower than C. The numerical results of both versions are identical.
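
For comparison, here is a sketch of how the COMPUTATIONAL-2 edition might declare the same quantity (illustrative, not the benchmark's actual source; a COMP-2 item takes no PICTURE clause, and I'm assuming the compiler accepts a VALUE clause on such an item):

    30 INDEX-OF-REFRACTION USAGE IS COMPUTATIONAL-2
                VALUE IS 1.6164.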

It is, of course, an act of utter gibbering lunacy to do scientific computation like this in COBOL, but it is nice to know that there are implementations of COBOL available which will do the job, if asked, albeit at a languid pace.

Posted at 14:21 Permalink

Saturday, September 29, 2012

Computing: MD5 Command-Line Utility Updated

I have just posted an update to the source code of the command-line MD5 utility. The new release, designated version 2.3, corrects problems when building on 64 bit Intel or AMD x86 platforms (x86_64 architecture). The MD5 algorithm requires variables which are guaranteed to be 32 bit unsigned integers, and the code which defines this data type failed to set it properly on x86_64 builds. I replaced the previous ad hoc architecture detection with an include of the C99 stdint.h header file, which provides a definition of uint32_t guaranteed to be correct for all architectures.
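
In outline, the change replaces architecture guesswork with the standard header. A minimal sketch, with the hypothetical type name uint32 standing in for whatever the MD5 source actually calls it:

    /* Before: ad hoc detection, which guessed wrong on x86_64,
       where unsigned long is 64 bits (illustrative only):
           typedef unsigned long uint32;                          */

    /* After: C99 guarantees uint32_t is exactly 32 bits, unsigned,
       on every conforming platform. */
    #include <stdint.h>
    typedef uint32_t uint32;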

The logic which sets the HIGHFIRST performance optimisation flag for big-endian architectures failed to detect x86_64 machines as little-endian. I added tests for __x86_64__ and __amd64__ so at least it should work out of the box with GCC. I also fixed a compiler warning in the code which edits the message displayed when HIGHFIRST is set incorrectly.
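
A sketch of the kind of test involved (the actual header logic differs in detail, but __x86_64__ and __amd64__ are genuine GCC predefined macros on 64 bit Intel/AMD targets):

    /* x86 and x86_64/amd64 are little-endian, so HIGHFIRST (the
       big-endian byte order optimisation flag) must not be defined. */
    #if !defined(__i386__) && !defined(__x86_64__) && !defined(__amd64__)
    #define HIGHFIRST       /* otherwise, assume big-endian byte order */
    #endif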

Since these fixes only affect builds from source code on x86_64 platforms, the ready to run Win32 executable was not rebuilt.

Posted at 16:58 Permalink

Friday, September 28, 2012

Reading List: Nuclear Assault

Imholt, Timothy James. Nuclear Assault. Unknown: Zwicky Press, 2012. ISBN 978-0-615-69158-9.
I am not going to fret about spoilers in this review. This book is so awful that nobody should read it, and avoiding spoilers is like worrying about getting a dog turd dirty when you pick it up with toilet paper to throw it in the loo.

I acquired this book based on an Amazon suggestion of “Customers who Viewed this Item Also Viewed” and especially because, at the time I encountered it, the Kindle edition was free (it is no longer, as of this writing). Well, I'm always a sucker for free stuff, so I figured, “How bad can it be?” and downloaded it. How wrong I was—even for free, this botched attempt at a novel is overpriced.

Apart from the story, which is absurd, the author has not begun to master the basics of English composition. If I had taken a chapter or two from this novel and submitted it as a short story in my 10th grade English class, I would have received a failing grade, and deservedly so. Scarcely a page in this 224 page novel is unmarred by errors of orthography, grammar, or punctuation. The author appears to have invented his own way of expressing quotes. The following is a partial list of words in the text which are either misspelled or for which homonyms are incorrectly used:

Americans OK advice affected an arrival assess attack bathe become breathe chaperone closed continuous counsel enemy's feet first foul from had hangar harm's hero holding host hostilely intelligence it's its let's morale nights not ordnance overheard pus rarefied scientists sent sights sure the their them they times were

When you come across an instance of “where” being used in place of “were”, you might put it down to the kind of fat finger we all commit from time to time, plus sloppy proofreading. But when it happens 13 times in 224 pages, you begin to suspect the author might not really comprehend the difference between the two.

All of the characters, from special forces troops, emergency room nurses, and senior military commanders, to the President of the United States and Iranian nuclear scientists, speak in precisely the same dialect of fractured grammar laced with malaprops. The author has his own eccentric idea of which words should be capitalised, and applies it inconsistently. Each chapter concludes with a “news flash” and “economic news flash”, also in bizarro dialect, with the latter demonstrating the author to be as illiterate in economics as he is in the English language.

Then, in the last line of the novel, the reader is kicked in the teeth with something totally out of the blue.

I'd like to call this book “eminently forgettable”, but I doubt I'll forget it soon. I have read a number of manuscripts by aspiring writers (authors, knowing me as a savage copy editor and fact checker, occasionally invite me to have at their work, in confidence, before sending it for publication), but this is, by far, the worst I have encountered in my entire life. You may ask why I persisted in reading beyond the first couple of chapters. It's kind of like driving past a terrible accident on the highway—do you really not slow down and look? Besides, I only review books I've finished; I looked forward to this review as the only fun I could derive from this novel, and I consider writing this wave-off a public service for others who might stumble upon this piece of…fiction and be inclined to pick it up.

Posted at 21:20 Permalink

Thursday, September 27, 2012

Reading List: Turing & Burroughs

Rucker, Rudy. Turing & Burroughs. Los Gatos, CA: Transreal Books, 2012. ISBN 978-0-9858272-3-6.
The enigmatic death of Alan Turing has long haunted those who inquire into the life of this pioneer of computer science. Forensic tests established cyanide poisoning as the cause of his death, and the inquest ruled it suicide by eating a cyanide-laced apple. But the partially-eaten apple was never tested for cyanide, and Turing's mother, among other people close to him, believed the death an accident, due to inhalation of cyanide fumes from an experiment in gold plating he was known to be conducting. Still others pointed out that Turing, from his work at Bletchley Park, knew all the deepest secrets of Britain's wartime cryptanalysis, and, having been shamefully persecuted by the government for his homosexuality, might have been considered a security risk and targeted to be silenced by dark forces of the state.

This is the point of departure for this delightful alternative history romp set in the middle of the 1950s. In the novel, Turing is presumed to have gotten much further with his work on biological morphogenesis than history records. So far, in fact, that when agents from Her Majesty's spook shop botch an assassination attempt and kill his lover instead, he is able to swap faces with him and flee the country to the anything-goes international zone of Tangier.

There, he pursues his biological research, hoping to create a perfect undifferentiated tissue which can transform itself into any structure or form. He makes the acquaintance of novelist William S. Burroughs, who found in Tangier's demimonde a refuge from the scandal of the death of his wife in Mexico and his drug addiction. Turing eventually succeeds, creating a lifeform dubbed the “skug”, and merges with it, becoming a skugger. He quickly discovers that his endosymbiont has not only dramatically increased his intelligence, but also made him a shape-shifter—given the slightest bit of DNA, a skugger can perfectly imitate its source.

And not just that…. As Turing discovers when he recruits Burroughs to skugdom, skuggers are able to enskug others by transferring a fragment of skug tissue to them; they can conjugate, exchanging “wetware” (memories and acquired characteristics); and they are telepathic among one another, albeit with limited range. Burroughs, whose explorations of pharmaceutical enlightenment had been in part motivated by a search for telepathy (which he called TP), found he rather liked being a skugger and viewed it as the next step in his personal journey.

But Turing's escape from Britain failed to completely cover his tracks, and indiscretions in Tangier brought him back into the crosshairs of the silencers. Shape-shifting into another identity, he boards a tramp steamer to America, where he embarks upon a series of adventures, eventually joined by Burroughs and Allen Ginsberg, on the road from Florida to Los Alamos, New Mexico, Burroughs's childhood stomping grounds, where Stanislaw Ulam, co-inventor of the hydrogen bomb and, like Turing, fascinated with how simple computational systems such as cellular automata can mimic the gnarly processes of biology, has been enlisted to put an end to the “skugger menace”—perhaps a greater threat than the international communist conspiracy.

Using his skugger wiles, Turing infiltrates Los Alamos and makes contact, both physically and intellectually, with Ulam, and learns the details of the planned assault on the skugs and vows to do something about it—but what? His human part pulls him one way and his skug another.

The 1950s are often thought of as a sterile decade, characterised by conformity and paranoia. And yet, if you look beneath the surface, the seeds of everything that happened in the sixties were sown in those years. They may have initially fallen upon barren ground, but like the skug, they were preternaturally fertile and, once germinated, spread at a prodigious rate.

In the fifties, the consensus culture bifurcated into straights and beats; Burroughs and Ginsberg were harbingers of the latter and rôle models for the emerging dissident subculture. The straights must have viewed the beats as alien—almost possessed: why else would they reject the bounty of the most prosperous society in human history which had, just a decade before, definitively defeated evil incarnate? And certainly the beats must have seen the grey uniformity surrounding them as also a kind of possession, negating the human potential in favour of a cookie-cutter existence, where mindless consumption tried to numb the anomie of a barren suburban life. This mutual distrust and paranoia was to fuel such dystopian visions as Invasion of the Body Snatchers, with each subculture seeing the other as pod people.

In this novel, Rucker immerses the reader in the beat milieu, with the added twist that here they really are pod people, and loving it. No doubt the beats considered themselves superior to the straights. But what if they actually were? How would the straights react, and how would a shape-shifting, telepathic, field-upgradable counterculture respond?

Among the many treats awaiting the reader is the author's meticulous use of British idioms when describing Turing's thoughts and Burroughs's idiosyncratic grammar in the letters in his hand which appear here.

This novel engages the reader to such an extent that it's easy to overlook the extensive research that went into making it authentic, not just superficially, but in depth. Readers interested in what goes into a book like this will find the author's background notes (PDF) fascinating—they are almost as long as the novel. I wouldn't, however, read them before finishing the book, as spoilers lurk therein.

A Kindle edition is available either from Amazon or directly from the publisher, where an EPUB edition is also available (with other formats forthcoming).

Fourmilog epilogue. I don't do “process” in book reviews, but here at Fourmilog we're all about sources and methods. The above review was written and edited entirely on an iPad with Apple's Pages application. I found this a wonderful way to occupy the time during an airline and train journey. Yes, the iPad on-screen keyboard can be irritating, yet even so I find researching and writing make the time pass even more quickly than just reading.

Posted at 00:46 Permalink

Saturday, September 22, 2012

Floating Point Benchmark: Haskell Language Added

I have posted an update to my trigonometry-intense floating point benchmark which adds Haskell to the list of languages in which the benchmark is implemented. A new release of the benchmark collection including Haskell is now available for downloading.

The Haskell benchmark was developed and tested on Glasgow Haskell Compiler version 7.4.1 on Ubuntu Linux 11.04; the relative performance of the various language implementations (with C taken as 1) is as follows. All implementations of the benchmark listed below produced identical results to the last (11th) decimal place.

Language             Relative Time   Details
C                        1           GCC 3.2.3 -O3, Linux
Visual Basic .NET        0.866       All optimisations, Windows XP
FORTRAN                  1.008       GNU Fortran (g77) 3.2.3 -O3, Linux
Pascal                   1.027       Free Pascal 2.2.0 -O3, Linux
                         1.077       GNU Pascal 2.1 (GCC 2.95.2) -O3, Linux
Java                     1.121       Sun JDK 1.5.0_04-b05, Linux
Visual Basic 6           1.132       All optimisations, Windows XP
Haskell                  1.223       GHC 7.4.1 -O2 -funbox-strict-fields, Linux
Ada                      1.401       GNAT/GCC 3.4.4 -O3, Linux
Lisp                     7.41        GNU Common Lisp 2.6.7, Compiled, Linux
                        19.8         GNU Common Lisp 2.6.7, Interpreted
Smalltalk                7.59        GNU Smalltalk 2.3.5, Linux
Python                  17.6         Python 2.3.3 -OO, Linux
Perl                    23.6         Perl v5.8.0, Linux
Ruby                    26.1         Ruby 1.8.3, Linux
JavaScript              27.6         Opera 8.0, Linux
                        39.1         Internet Explorer 6.0.2900, Windows XP
                        46.9         Mozilla Firefox 1.0.6, Linux
QBasic                 148.3         MS-DOS QBasic 1.1, Windows XP Console

Benchmarking in a purely functional language with lazy evaluation like Haskell presents formidable challenges and requires some judgement calls which do not occur in benchmarks of procedural languages (although similar situations may arise in procedural languages with strong optimisation of invariant results).

Due to Haskell's lazy evaluation, a straightforward port of the floating point benchmark code from a procedural language (I started with the Ada implementation for this project, since I considered it the cleanest and best documented, using data types much in the spirit of Haskell) will result in a benchmark program which runs in essentially constant time. Since only the last result of the many iterations through the benchmark computation actually contributes to output visible to the outside world, all of the earlier computations go ker-thunk into the memory hole and are never actually performed. “If they can't see it, why do it?” is the Haskell motto. I've had employees who work that way.

A wise manager once told me, “People don't do what you expect, but what you inspect.” So it is with Haskell. In order to get a fair benchmark against a procedural language, we must arrange that each of the requested iterations in the benchmark run actually be fully calculated. In the main fbench.hs, we accomplish this by two ruses. First of all, for all but the final ray trace we vary the clear aperture of the design by adding a pseudorandom value to it. (Since the ray tracing algorithm runs in constant time regardless of this parameter [or the rest of the design, for that matter], this does not affect the timing result.) Each ray trace is then forced to be evaluated by applying the “deepseq” function to its ultimate result. (You may have to install the deepseq package into your Haskell environment—it is not, for example, installed by default on Ubuntu Linux when you install the compiler, ghc.) Since all we care about is ensuring the clear aperture is different on each successive ray trace, we use a pseudorandom generator with a constant seed.
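
To make the technique concrete, here is a minimal self-contained sketch of forcing per-iteration evaluation with deepseq. Note that traceRay and perturb are stand-ins I've invented for this illustration, not the benchmark's actual functions:

    import Control.DeepSeq (deepseq)

    -- Stand-in for the ray trace: any computation whose result
    -- must actually be evaluated on every iteration.
    traceRay :: Double -> Double
    traceRay aperture = sin aperture * cos aperture

    -- Cheap deterministic perturbation of the clear aperture,
    -- standing in for the constant-seed pseudorandom generator.
    perturb :: Int -> Double
    perturb n = fromIntegral (n `mod` 100) * 1.0e-12

    -- deepseq forces each iteration's result, so lazy evaluation
    -- cannot discard every iteration but the last.
    runIterations :: Int -> Double -> Double
    runIterations 0 acc = acc
    runIterations n _ =
        let r = traceRay (4.0 + perturb n)
        in  r `deepseq` runIterations (n - 1) r

    main :: IO ()
    main = print (runIterations 1000000 0)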

Reference results are produced by compiling fbench.hs into a binary executable (see the Makefile for details) and then running it with an iteration count (specified by the first argument on the command line) which results in a run time of around five minutes. If a positive iteration count is specified, a classic fbench run which expects manual timing will be done. If the iteration count is negative, the run will be in batch mode, intended to be timed by running under the Unix “time” utility or equivalent.

But note that we're doing additional work: generating the pseudorandom numbers, plus the clanking recursion machinery and its consequent garbage collections. Haskell's default pseudorandom generator is famously slow, so we must be careful that we're measuring Haskell's performance on our own code, rather than on the wrapper which invokes the generator. This is accomplished by the fbench_baseline.hs program, which uses precisely the same logic to run the benchmark, but performs a null computation in place of the ray trace. (I've left all the ray trace code in this program, in case you want to experiment with switching all or part of it back on.) To calculate the “true” run time of the benchmark on the ray tracing code, we take the run time of fbench.hs and subtract that of fbench_baseline.hs for the same number of iterations. In comparing this with the run time of procedural languages, we assume that the loop overhead of those languages wrapping iterations of the benchmark is negligible; this is almost always the case, but it is not so in Haskell, hence the need for this correction.
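
For example (hypothetical numbers, purely to illustrate the arithmetic): if fbench.hs takes 300 seconds for a given iteration count and fbench_baseline.hs takes 20 seconds for the same count, then 300 − 20 = 280 seconds is the time attributed to the ray tracing code.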

Haskell purists may object to this whole exercise and contend that the fairest benchmark is a straight port of the C, FORTRAN, etc. code, measuring it against those languages. I am sympathetic to this argument—after all, if one of the main design goals of the language is to allow factoring out redundant computations, isn't that worthy of being measured in a benchmark? On the other hand, the mission of fbench, since its origin in the 1980s, has been to measure the performance of trigonometric function intense code, and one hardly accomplishes that by running code where all but the last iteration of the benchmark is optimised out by the compiler. Understand—I am extremely impressed, if not in awe, of Haskell's optimisation, but I also want to get a result which people can compare against realistic code, not a contrived benchmark like this one which (in procedural language implementations) simply does the same computation over and over.

If you want to experience the magic of lazy evaluation, build the fbench_pure.hs program. It eschews both the randomisation of the design being ray traced and the forcing of evaluation of the results, and consequently runs in near-constant time regardless of the iteration count requested. If you wish to explore other strategies for fairly benchmarking Haskell, it's probably best to start with fbench_pure.hs, to which you can add your own code to force evaluation as appropriate.

Finally, let me note that this is the first Haskell program I have ever written (I developed a deep appreciation for the language in the process, and I trust it shall not be my last). I have tried to do things “the Haskell way”, but I may have committed any number of beginner blunders which elicit gnashville sounds from the teeth of those fluent in the language. If so, please chastise me and offer suggestions as to how I might remedy the shortcomings you perceive in this code.

In this release, the C benchmarks, which were previously at the top level of the directory tree in the archive, have been moved into their own c subdirectory with its own Makefile. The C benchmark programs are unchanged.

Update: Benchmark results for Haskell revised to use timings with GHC 7.4.1 and Haskell platform 2012.2.0.0. Its results are substantially better than those of the 6.12.3 compiler I originally used. (2012-09-23 12:23 UTC)

Update: Benchmark results for Haskell revised to use timings with strict fields for primitive types and the -funbox-strict-fields compiler option, which almost doubled execution speed. Special thanks to Don Stewart for analysing the benchmark and recommending this. (2012-09-24 15:40 UTC)

Posted at 15:47 Permalink

Thursday, September 20, 2012

Reading List: Tubes

Blum, Andrew. Tubes. New York: HarperCollins, 2012. ISBN 978-0-06-199493-7.
The Internet has become a routine fixture in the lives of billions of people, the vast majority of whom have hardly any idea how it works or what physical infrastructure allows them to access and share information almost instantaneously around the globe, abolishing, in a sense, the very concept of distance. And yet the Internet exists—if it didn't, you wouldn't be able to read this. So, if it exists, where is it, and what is it made of?

In this book, the author embarks upon a quest to trace the Internet from that tangle of cables connected to the router behind his couch to the hardware which enables it to communicate with its peers worldwide. The metaphor of the Internet as a cloud—simultaneously everywhere and nowhere—has become commonplace, and yet as the author begins to dig into the details, he discovers the physical Internet is nothing like a cloud: it is remarkably centralised (a large Internet exchange or “peering location” will tend to grow ever larger, since networks want to connect to a place where the greatest number of other networks connect), often grungy (when pulling fibre optic cables through century-old conduits beneath the streets of Manhattan, one's mind turns more to rats than clouds), and anything but decoupled from the details of geography (undersea cables must choose a route which minimises risk of breakage due to earthquakes and damage from ship anchors in shallow water, while taking the shortest route and connecting to the backbone at a location which will provide the lowest possible latency).

The author discovers that while much of the Internet's infrastructure is invisible to the layman, it is populated, for the most part, with people and organisations open and willing to show it off to visitors. As an amateur anthropologist, he surmises that to succeed in internetworking, those involved must necessarily be skilled in networking with one another. A visit to a NANOG gathering introduces him to this subculture and the retail politics of peering.

Finally, when non-technical people speak of “the Internet”, it isn't just the interconnectivity they're thinking of but also the data storage and computing resources accessible via the network. These also have a physical realisation in the form of huge data centres, sited based upon the availability of inexpensive electricity and cooling (a large data centre such as those operated by Google and Facebook may consume on the order of 50 megawatts of electricity and dissipate that amount of heat). While networking people tend to be gregarious bridge-builders, data centre managers view themselves as defenders of a fortress and closely guard the details of their operations from outside scrutiny. When Google was negotiating to acquire the site for their data centre in The Dalles, Oregon, they operated through an opaque front company called “Design LLC”, and required all parties to sign nondisclosure agreements. To this day, if you visit the facility, there's nothing to indicate it belongs to Google; on the second ring of perimeter fencing, there's a sign, in Gothic script, that says “voldemort industries”—don't be evil! (p. 242) (On p. 248 it is claimed that the data centre site is deliberately obscured in Google Maps. Maybe it once was, but as of this writing it is not. From above, apart from the impressive power substation, it looks no more exciting than a supermarket chain's warehouse hub.) The author finally arranges to cross the perimeter, get his retina scanned, and be taken on a walking tour around the buildings from the outside. To cap the visit, he is allowed inside to visit—the lunchroom. The food was excellent. He later visits Facebook's under-construction data centre in the area and encounters an entirely different culture, so perhaps not all data centres are Morlock territory.

The author comes across as a quintessential liberal arts major (which he was) who is alternately amused by the curious people he encounters who understand and work with actual things as opposed to words, and enthralled by the wonder of it all: transcending space and time, everywhere and nowhere, “free” services supported by tens of billions of dollars of power-gobbling, heat-belching infrastructure—oh, wow! He is also a New York collectivist whose knee-jerk reaction is “public, good; private, bad” (notwithstanding that the build-out of the Internet has been almost exclusively a private sector endeavour). He waxes poetic about the city-sponsored (paid for by grants funded by federal and state taxpayers plus loans) fibre network that The Dalles installed which, he claims, lured Google to site its data centre there. The slightest acquaintance with economics or, for that matter, arithmetic, demonstrates the absurdity of this. If you're looking for a site for a multi-billion dollar data centre, what matters is the cost of electricity and the climate (which determines cooling expenses). Compared to the price tag for the equipment inside the buildings, the cost of running a few (or a few dozen) kilometres of fibre is lost in the round-off. In fact, we know from p. 235 that the 27 kilometre city fibre run cost US$1.8 million, while Google's investment in the data centre is several billion dollars.

These quibbles aside, this is a fascinating look at the physical substrate of the Internet. Even software people well-acquainted with the intricacies of TCP/IP may have only the fuzziest comprehension of where a packet goes after it leaves their site, and how it gets to the ultimate destination. This book provides a tour, accessible to all readers, of where the Internet comes together, and how counterintuitive its physical realisation is compared to how we think of it logically.

In the Kindle edition, end-notes are bidirectionally linked to the text, but the index is just a list of page numbers. Since the Kindle edition does include real page numbers, you can type in the number from the index, but that's hardly as convenient as books where items in the index are directly linked to the text. Citations of Internet documents in the end notes are given as URLs, but not linked; the reader must copy and paste them into a browser's address bar in order to access the documents.

Posted at 21:47 Permalink

Sunday, September 9, 2012

Reading List: Foreign Enemies and Traitors

Bracken, Matthew. Foreign Enemies and Traitors. Orange Park, FL: Steelcutter Publishing, 2009. ISBN 978-0-9728310-3-1.
This is the third novel in the author's “Enemies” trilogy, which began with Enemies Foreign and Domestic (December 2009), and continued with Domestic Enemies (March 2012). Here, we pick up the story three years after the conclusion of the second volume. Phil Carson, whom we last encountered escaping from the tottering U.S. on a sailboat after his involvement in a low-intensity civil war in Virginia, is returning to the ambiguously independent Republic of Texas, smuggling contraband no longer available in the de-industrialised and bankrupt former superpower, when he is caught in a freak December hurricane in the Gulf of Mexico and shipwrecked on the coast of Mississippi.

This is not the America he left. The South is effectively under martial law, administered by General Marcus Aurelius Mirabeau; east Texas has declared its independence; the Southwest has split off as Aztlan and secured autonomy in the new Constitution; the East and upper Midwest remain under the control of the ever more obviously socialist regime in Washington; and the American redoubt states in the inland northwest are the last vestige of liberty. The former United States have not only been devastated by economic collapse and civil strife stemming from the attempt to ban and confiscate weapons, but then ravaged by three disastrous hurricanes and two earthquakes on the New Madrid fault. It's as if God had turned his back on the United States of America—say “no” to Him three times, and that may happen.

Carson, a Vietnam special forces veteran, uses his skills at survival, evasion, and escape, as well as his native cunning, to escape (albeit very painfully) to Tennessee, which is in the midst of a civil war. Residents, rejecting attempts to disarm them (which would place them at risk of annihilation at the hands of the “golden horde” escaping devastated urban areas and ravaging everything in their path), are now confronted with foreign mercenaries from such exemplars of human rights and rule of law as Kazakhstan and Nigeria, brought in because U.S. troops have been found too squeamish when it comes to firing on their compatriots: Kazakhstani cavalry—not so much. (In the book, these savages are referred to as “Kazaks”. “Kazakhstani” is correct, but as an abbreviation I think “Kazakh” [the name of their language] would be better.)

Carson, and the insurgents with whom he makes contact in Tennessee, come across incontrovertible evidence of an atrocity committed by Kazakhstani mercenaries, at the direction of the highest levels of what remains of the U.S. government. In a world with the media under the thumb of the regime and the free Internet a thing of the past, getting this information out requires the boldest of initiatives, and recruiting not just career NCOs, the backbone of the military, but also senior officers with the access to carry out the mission. After finishing this book, you may lose some sleep pondering the question, “At what point is a military coup the best achievable outcome?”.

This is a thoroughly satisfying conclusion to the “Enemies” trilogy. Unlike the previous volumes, there are a number of lengthy passages, usually couched as one character filling in another about events of which they were unaware, which sketch the back story. These are nowhere near as long as Galt's speech in Atlas Shrugged (April 2010), which didn't bother me in the least (I thought it brilliant all of the three times I've read it), but they do ask the reader to kick back from the action and review how we got here and what was happening offstage. Despite the effort to make this book work as a stand-alone novel, I'd recommend reading the trilogy in series—if you don't, you'll miss the interactions between the characters, how they came to be here, and why the fate of the odious Bob Bullard is more than justified.

Extended excerpts of this and the author's other novels are available online at the author's Web site.

Posted at 22:36 Permalink