Seeing impending doom, and following up
WARNINGS: FINDING CASSANDRAS TO STOP CATASTROPHES By Richard A. Clarke and R.P. Eddy Ecco, $29.99, 416 pages
This book is about the capability to forecast future trends, particularly impending disasters, in spite of conventional wisdom’s usual dismissal of such warnings, which is part of what is termed the Cassandra complex. Cassandra, the authors explain, was a Greek princess who was endowed with “the ability to see impending doom, but the inability to persuade anyone to believe in her.”
The authors are well qualified to write about decision-making at the highest levels. Richard Clarke is a veteran national security expert who served in the U.S. government and the White House; his co-author, R.P. Eddy, is the CEO of Ergo, a New York-based business intelligence firm.
To detect the presence of “a real Cassandra among the myriad of pundits” in today’s world, the authors present short case studies of experts who exhibited a Cassandra-like prescience about major disasters in their respective fields but were ignored.
These case studies include the first Bush administration’s failure to follow up on warnings by a CIA national intelligence officer about Iraq’s impending invasion of Kuwait, which occurred in August 1990; the failure of federal emergency management agencies to heed warnings by civil engineering professors at Louisiana State University’s Hurricane Center about the devastating damage a massive hurricane could cause, as was the case with Hurricane Katrina in August 2005; the failure of Japanese nuclear authorities to heed warnings by the director of Japan’s Active Fault and Earthquake Research Center that a disaster at Fukushima was possible, which came to pass in March 2011; and the failure of Wall Street regulatory bodies to act on warnings by several financial analysts that a financial meltdown was imminent, which materialized in September 2008.
In another case study, on the rise of ISIS, the authors highlight the role of former U.S. ambassador to Syria Robert Ford as the Cassandra who in late 2012 advocated American arming of the non-jihadist Free Syrian Army (FSA) opposition, which the authors claim would have prevented ISIS’s takeover of the anti-Assad opposition. One may not agree with this assessment, because the situation in Syria was more complicated than they portray (the FSA, for instance, included substantial elements of the Syrian Muslim Brotherhood rather than the hoped-for secular and democratic elements), but this chapter is still worth reading for its discussion of what Ambassador Ford recommended and how it met with resistance from the Obama administration.
To remedy the tendency of “conventional wisdom” to dismiss the warnings by such Cassandras, the authors propose to empower others with the capability to forecast impending disasters through what they term the “Cassandra Coefficient,” which consists of a series of questions derived from their observation of past Cassandra events.
As they explain, “It involves four components: (1) the warning, the threat, or risk in question, (2) the decision-makers or audience, who must react, (3) the predictor or possible Cassandra, and (4) the critics who disparage or reject the warning.” Each of these four components is accompanied by several characterizing factors, such as “erroneous consensus” for the warning, or “ideological response rejection” for the decision-makers, which add up to 24 factors in total. These 24 factors enable the user to score each of the four components as high, medium, low or absent in order to gauge how likely a Cassandra-type warning is to be accepted.
Once a Cassandra prediction has been made and persuasively argued (persuasion being a necessary element for a warning to be accepted by decision-makers), the authors turn their analysis to the response component, which they explain needs to involve an intelligence-style indications-and-warning surveillance strategy, hedging, mitigation and prevention.
This methodology is then applied in the book’s remaining chapters, with case studies on the potential for artificial intelligence, if not properly controlled, to pose what Elon Musk, the CEO of Tesla Motors, has called humanity’s “biggest existential threat,” and for the interconnected “Internet of Things” to be vulnerable to what the authors term “universal hackability.”
The authors conclude that such an early warning system, based on the “Cassandra Coefficient” methodology, needs to be institutionalized in government (such as at the White House and Cabinet departments), the corporate world, and other sectors, to be pre-emptive in anticipating and preventing future disasters from occurring in crucial spheres affecting society. This is urgent, as they write that “we must systematically identify the people who see the risks first, test what these potential Cassandras are saying, then make transparent and explicit decisions about how to deal with the risk.” Otherwise, they caution, “Given the potential risks we will be facing in this century, the costs of not facing them in time will be unprecedented. Thus, it is important to take the time to listen for Cassandra. Can you hear her?”
The pervasive and continuous turbulence characterizing our current geopolitical and technological world makes this book essential reading for understanding how we can begin taking the steps necessary to prevent further turbulence and disasters.