This second post on The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb emphasises some of the author’s points that could be useful to foresight and warning, and to all “predictive” work. Many of those themes are not really new, but are already integrated in F&W and, more broadly, in analysis. Nonetheless, it is always useful to underline them, as it is so easy to forget best practice. (The first post can be accessed here.)

Humility

(Notably pp. 190-200) Considering uncertainty, our imperfect condition as human beings, the complexity of the social world, feedbacks, and our more than insufficient knowledge and understanding, we must be very humble, accept our partial ignorance, our imperfection and our mistakes (and make sure those essentially human flaws are accepted by others, which may be more difficult). Yet we must also struggle to improve ourselves, to increase our understanding and our capability to foresee the future. Doubt, humility, real dialogue between the different communities that try to understand the world, and reflection upon mistakes – to correct what can be identified as wrong or inefficient – and upon successes – to reproduce what worked (according to conditions) – are the keys to this improvement.

Taleb’s use of and reference to Montaigne’s wisdom also points to the importance of struggling against the loss of memory – institutional, scientific and general – that plagues us. Some things that were understood in the past are now misconstrued or ignored. It would appear, sometimes, that we are caught in a race where youth, novelty, fads and the shortening of time rule as masters. Yet, shouldn’t we pause for a while and wonder about this behaviour and its origin? Should we not question the results stemming from this headlong race forward? For example, in science (soft and hard), the fact that something was understood, discovered or written decades or centuries ago does not make it wrong. On the contrary, good science starts with knowledge and understanding of past scientific discoveries. Some understandings are outdated, but some are not. Novelty and soundness of analysis are not synonymous, while discarding the whole past only makes us lose time. Consumerism cannot and should not be applied everywhere.

“Black Swan events” (unpredictability, outliers)

As underlined last week, Taleb makes a distinction between “Mandelbrotian Gray Swans” (rare but expected events that are scientifically tractable, pp. 37, 272-273) and real “Black Swan events,” which are never identified in advance. From that we could derive the following “rules”:

  • Making swans gray

Try to imagine as many improbable events as possible, initially suspending disbelief. This is already done; methods, however tentative, exist: wild card scenarios (e.g. James A. Dewar, “The Importance of ‘Wild Card’ Scenarios”); brainstorming; “what if” stories and narratives; use of alternative thinkers and thinking.

The key, here, is imagination and allowing oneself to go beyond groupthink, norms (institutional, social, cultural) and belief-systems, even if ideas may feel dangerous (read for example “In defense of dangerous ideas” by Steven Pinker, Harvard College professor and cognitive scientist, July 2007). Then, as suggested and explained by Dewar, because resources are limited and also because even Swans have to follow a few rules, those potential “gray swans” should be examined in the light of all the other rules. The least likely (or the most absurd) should be discarded. For example, we may always assume that gravity on earth could disappear, or that lambs will become carnivorous; yet the chances are so minimal that we may choose to dispense with these situations. For those events that remain on the gray swans list, potential impacts can be estimated and highly improbable, high-impact scenarios developed (a minimal sketch of such a triage follows the list below).

  • The absence of certainty

Because we may assume that the likelihood of the existence of Black Swans is very high, we must consider them. This will influence our estimation of probability. We may as well forget certainty. This may look like nothing, but I suspect that in the world of security and politics, where the issues of power and control – including in personal terms – are so crucial, truly accepting uncertainty and insecurity is a major effort.
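Coming back to the first rule, here is a minimal sketch, in Python, of the gray-swan triage described above: imagine candidate events, discard the effectively absurd, and rank what remains by potential impact. All event names, likelihood figures and the discard threshold are illustrative assumptions of mine, not values from Taleb or Dewar.

```python
# A minimal sketch of the "making swans gray" triage described above.
# All names, likelihoods and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CandidateEvent:
    name: str
    likelihood: float  # subjective, in (0, 1); never exactly 0 or 1
    impact: float      # subjective severity score, e.g. 0-10

ABSURDITY_FLOOR = 1e-9  # below this, discard (e.g. "gravity disappears")

def triage(candidates: list[CandidateEvent]) -> list[CandidateEvent]:
    """Keep imaginable-but-improbable events, drop the effectively absurd,
    and rank the remaining gray swans by potential impact."""
    gray_swans = [c for c in candidates if c.likelihood > ABSURDITY_FLOOR]
    return sorted(gray_swans, key=lambda c: c.impact, reverse=True)

if __name__ == "__main__":
    pool = [
        CandidateEvent("Gravity disappears on Earth", 1e-30, 10.0),
        CandidateEvent("Sudden state collapse in region X", 0.02, 9.0),
        CandidateEvent("Key trade route closed for a year", 0.05, 7.5),
    ]
    for event in triage(pool):
        print(f"{event.name}: impact={event.impact}, p~{event.likelihood}")
```

The point is not the numbers, which remain subjective estimates, but the discipline of keeping improbable events on the list rather than discarding them by default.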

Continuing the struggle against biases

(pp. 1-164) Cognitive biases being an inescapable part of the human condition, the least we can do is be aware of them and persist in our struggle against them. Using our increasing knowledge and awareness of cognition, we may continue applying and creating specific training, and systematically incorporate related safeguards into methodologies and processes, from organization (for example, people joining the exercise at different stages) to team composition (people with different backgrounds, psychological makeups, etc.).

Meanwhile, introspection and reflection should be promoted by and for those who deal with foresight, forecasting and prediction, as exemplified (here on the question of ethics and the potential biases induced by “conflicts of interest”) by this very recent post by Jay Ulfelder. “Introspective phases” could and should be included at different stages of the foresight process.

Those phases should notably fight against the known phenomena of anachronistic projections and cultural projections. Anachronistic projections usually concern past history (judging past actions from the point of view of today’s moral norms; understanding the past through today’s lenses), which obscures understanding. For example, we currently struggle against drug trafficking, notably because it endangers our societies and is seen as morally bad. Yet opium was an accepted state activity at least from the 19th century well into the 20th century. This does not mean going backwards in terms of norms, or accepting things that are seen as morally reprehensible or are damaging or hurtful; it would, however, help us locate phenomena in their proper context, and thus focus on dynamics, processes and understanding. It would also (ideally) help us move from an attitude that favours judging and casting blame, with all the power struggles and violence this implies, towards a much more constructive behaviour, promoting understanding, prevention and healing.

A political scientist* somewhere gives a great trick that we could usefully apply: if you read (we can extend this to think or say) the word “always,” then stop and think.
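As a toy illustration, the trick could even be partly automated as a review aid. The word list below is purely my assumption and would need adapting to each team’s vocabulary.

```python
# A toy implementation of the "stop and think" trick: flag absolute words
# in a draft analysis for human review. The word list is an assumption.
import re

ABSOLUTES = re.compile(r"\b(always|never|certainly|impossible)\b", re.IGNORECASE)

def flag_absolutes(text: str) -> list[str]:
    """Return each line containing an absolute claim, for human review."""
    return [line for line in text.splitlines() if ABSOLUTES.search(line)]

draft = "Regime X will always resist reform.\nScenario B remains plausible."
for line in flag_absolutes(draft):
    print("REVIEW:", line)
```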

Not falling prey to anachronistic projections would also imply considering the evolution of ideas and norms, and setting time-dependent struggles within historical processes. Coming back to foresight, anachronistic projections may just as well be made regarding the future. What does this imply? Can we devise methods to try to minimize them? How can we best proceed to include ideas, norms and beliefs in our models?

Cultural projections are even better known and may be easier to address. Not falling prey to them demands knowing our own cultural sets of norms on top of, or even prior to, those of others (e.g., in the anticipation field, Werther, 2008). Just asking ourselves, during foresight exercises, which of our assumptions are culturally grounded could improve results. Similarly, we must struggle against being victims of ideology. This does not mean rejecting this or that belief, just being aware of what influences us.

Finally, the impact of emotions on our cognition – emotional biases – should be considered, as human beings are definitely not rational, emotionless beings.

Falsification rather than confirmation

(Notably pp. 55-61) The risks of induction, which matter so much to us because so many of our analyses are built from collected evidence, are linked to our tendency to seek confirmation rather than falsification (looking for an element, a fact, an event that would prove our hypothesis or explanation wrong). All our analyses – and this is valid for all our explanations and understanding, not only for foresight and warning – should include an effort at falsification, without however denying confirming facts. We should always wonder: which evidence or fact should I look for to disprove my theory, analysis, estimation or conjecture?

Furthermore, this effort should be mentioned in the final anticipatory product (potentially in an appendix, according to the delivery needs of the customer) to allow for follow-up and updates. Meanwhile, indicators specifically designed for falsification should be created. For example, in foresight, if we have concurrent scenarios, the occurrence of an event, or any indication showing that one scenario is becoming less likely, should be considered and the set of scenarios revised accordingly, as sketched below.
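As a minimal sketch of what such falsification indicators could look like in practice, consider the following Python fragment. The scenario names, indicators and weights are invented for illustration, and real revision would of course involve analytical judgment rather than simple renormalisation.

```python
# A minimal sketch of falsification-oriented indicators for concurrent
# scenarios. Scenario names, indicators and weights are assumptions.
scenarios = {
    "Scenario A: negotiated settlement": {
        "weight": 0.5,
        "falsifiers": ["talks formally abandoned", "full mobilisation ordered"],
    },
    "Scenario B: protracted conflict": {
        "weight": 0.5,
        "falsifiers": ["comprehensive ceasefire holds 90 days"],
    },
}

def revise(observed_events: set[str]) -> None:
    """Zero out any scenario whose falsifying indicator has occurred,
    then renormalise the weights of the surviving set."""
    for name, s in scenarios.items():
        if any(f in observed_events for f in s["falsifiers"]):
            s["weight"] = 0.0
            print(f"Falsified, revise the set: {name}")
    total = sum(s["weight"] for s in scenarios.values())
    if total > 0:
        for s in scenarios.values():
            s["weight"] /= total

revise({"talks formally abandoned"})
print({name: round(s["weight"], 2) for name, s in scenarios.items()})
```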

Careful causality: “silent evidence”

(pp. 100-121) Taleb starts by cautioning against the dangers of applying causality when there is none, or when “silent evidence” could distort causal reasoning. “Silent evidence” is what we don’t know, cannot know, cannot hear, do not hear. To explain “silent evidence,” Taleb uses Cicero’s story: “Diagoras, a nonbeliever in the Gods, was shown painted tablets bearing the portraits of some worshippers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning. Diagoras asked, ‘Where were the pictures of those who prayed, then drowned?’” Taleb, however, does not reject causality; he encourages us to use it with care and caution, trying to think about the possible existence of “silent evidence.”
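A toy simulation makes Diagoras’ point tangible: if survival is equally likely whether or not a sailor prayed, looking only at the painted tablets (the surviving worshippers) still makes prayer appear perfectly protective. The rates used are illustrative assumptions.

```python
# A toy simulation of Cicero's tablets: "silent evidence" makes prayer
# look protective when we only see survivors. Numbers are assumptions.
import random

random.seed(42)
SURVIVAL_RATE = 0.3  # identical whether or not the sailor prayed

sailors = [
    {"prayed": random.random() < 0.5, "survived": random.random() < SURVIVAL_RATE}
    for _ in range(100_000)
]

# What the painted tablets show: only survivors who prayed.
tablets = [s for s in sailors if s["prayed"] and s["survived"]]
print(f"Tablets on display: {len(tablets)} praying survivors, no drowned visible")

# Diagoras' question: include those who prayed, then drowned.
prayed = [s for s in sailors if s["prayed"]]
rate = sum(s["survived"] for s in prayed) / len(prayed)
print(f"Actual survival rate among those who prayed: {rate:.2%}")  # ~30%
```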

Creation of new adapted quantitative tools

(The whole book) The author warns us to distrust, except in the cases where they genuinely apply, correlations – which furthermore do not allow for understanding, per the famous “correlation does not imply causation” – linear trends and Gaussian distributions. Instead, research involving fractals and scalability, and complexity theory (as done for example by the New England Complex Systems Institute), should be favoured. Building upon this, we can imagine creating new tools allowing for multi-disciplinary research: articulating, when necessary (there is no need to use something very complicated if a simple analysis or logic is sufficient), complex modeling and agent-based models; mixing quantitative and qualitative assessments (notably including processes and feedbacks); and integrating them within foresight methodologies. What should guide us is always the issue or problem at hand.
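To make the contrast concrete, here is a minimal sketch comparing extreme deviations under a Gaussian and under a scalable, fat-tailed (Pareto) distribution. The parameters are illustrative assumptions, not a calibrated model of any real phenomenon.

```python
# A minimal sketch contrasting a Gaussian with a scalable (power-law)
# distribution. Sample size and tail exponent are assumptions.
import random

random.seed(7)
N = 100_000

gaussian = [random.gauss(0, 1) for _ in range(N)]
pareto = [random.paretovariate(1.5) for _ in range(N)]  # fat-tailed, alpha=1.5

def tail_share(sample: list[float], k: float) -> float:
    """Fraction of observations more than k times the mean absolute value."""
    mean = sum(abs(x) for x in sample) / len(sample)
    return sum(abs(x) > k * mean for x in sample) / len(sample)

for k in (5, 10, 20):
    print(f"> {k}x mean: Gaussian {tail_share(gaussian, k):.2e}, "
          f"Pareto {tail_share(pareto, k):.2e}")
```

Under the Gaussian, deviations beyond a few multiples of the mean effectively never occur; under the power law, they remain routine, which is the practical meaning of scalability for anyone assessing rare, high-impact events.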

* Unfortunately I cannot remember who wrote this, nor in which book or article. Initially I thought it was Benedict Anderson in Imagined Communities, but after skimming through the book again I cannot locate the citation; it could thus be Anthony Smith, or Eric Hobsbawm, or… if anyone knows, I would welcome the reference!

————-

References

Dewar, James A., “The Importance of ‘Wild Card’ Scenarios,” Discussion Paper, RAND.

Pinker, Steven, “In defense of dangerous ideas”, July 2007.

Ulfelder, Jay, “Advocascience,” Dart-Throwing Chimp, 27 January 2013.

Werther, Guntram F. A., Holistic Integrative Analysis of International Change: A Commentary on Teaching Emergent Futures, The Proteus Monograph Series, Volume 1, Issue 3, January 2008.
