Category: Nuts & Bolts

How to Read a Large Amount of Information

The first part of this article is available in open access; the second part is reserved for members and registered participants in our courses.

The incredible and growing amount of information available nowadays presents us with specific challenges that we need to overcome first, if we want to be able to understand, foresee, warn about, and finally adequately answer accumulating dangers, threats, risks or, more broadly, changes and uncertainties. Our information age is indeed characterised by what Martin Hilbert called the “global information explosion” (“Digital Technology & Social Change”, University of California Course, 2015), in which we constantly face “information overload” (among many others, Bertram Gross, The Managing of Organizations, 1964; Alvin Toffler, Future Shock, 1970; also Stanley Milgram, “The experience of living in cities“, Science, 167, 1461-1468, 1970).

Google estimated in 2010 that 129,864,880 books had by then been published (Leonid Taycher, “Books of the world, stand up and be counted! All 129,864,880 of you,” 5 Aug 2010). Wikipedia estimates that “approximately 2,200,000” books are published each year across the world. Meanwhile, it is almost frightening to watch live the constantly growing number of internet websites: 1,080,387,230+ on 15 Sept 2016 (Internet Live Stats).

Those are general figures, but they are also representative of what we must face when we work on a specific topic, because we have to… Continue reading How to Read a Large Amount of Information

The Red (Team) Analysis Weekly 157 – Information Wars

Editorial – Information Wars – Information, or more broadly belief-based, wars seem to multiply right now, relayed by many, although fortunately not all, official declarations, articles and analyses. This is a worrying phenomenon because it leads directly to polarization (enhancing feelings of threat, fear, “all because of an evil other that must be fought”) and to inaccurate analyses, which in turn also fuel polarization.

Information wars: propaganda, biases and conspiracy theories

We can see this phenomenon at work regarding Ukraine, Iraq, or, to a lesser extent because the spotlight is not currently directed at this issue, China and the various disputes in the East and South China Seas. In Iraq, the way the al-Maliki government accuses Saudi Arabia of supporting ISIS, when a more objective but also more complex reality is at work (see among others Elizabeth Dickinson, “ISIS advance in Iraq forces Gulf donors to rethink their patronage”, CSMonitor), is a case in point. We can also see it at work in this fascinating video (see it also here) of Sunday Times journalist Mark Franchetti describing, to a shocked public on a live show on Ukrainian TV, his experience and work in Eastern Ukraine, and underlining that the people who end up being targeted by the Ukrainian Anti-Terrorist Operation are for the most part neither terrorists nor Russian troops but everyday Ukrainians. This video appears to be genuine, neither a hoax nor crafted propaganda, all the more so as its content fits with many other accounts and videos.

Most of the time, when thinking about information wars, we associate them with ideology and propaganda, usually used by “the other side” (the enemy), which must be fought lest we (our side, the “good guys”) be deceived. They are less frequently associated with biases, i.e. “cognitive errors”, which may affect each and every person and bend their understanding of a situation. In the first case there is a purpose, an intention to deceive. In the second case there is none, but rather a recognition of our fragilities and imperfections. Actually, the first may sometimes be engineered, while the second is always at work. However, and in an often confusing way, unintentionally held (false or not fully true) beliefs are then often spread intentionally as the real truth, and thus perceived by others, either more objective or holding different beliefs, as propaganda.

Interestingly, Richard Evans just published an excellent article (“9/11, Moon landings, JFK assassination: conspiracy theories follow a deep pattern“) on conspiracy theories, related to the research programme he leads at the University of Cambridge (UK), Conspiracy and Democracy, and reflecting on how people can fall prey to biases without any propaganda being at work. Trust, or rather the lack thereof, as he emphasizes, is an important element favouring conspiracy theories, as are other biases described by Heuer (Psychology of Intelligence Analysis). Evans, in the examples he chooses, also points out that shocking events and highly tense and violent periods seem to be favourable ground for the spread of conspiracy theories. Thus, considering the current high level of tension worldwide, we should not be surprised to see untrue, unreal or ideological beliefs spreading. Indeed, there is not such a difference between inexact beliefs (whether spread intentionally or not) and conspiracy theories.

How can we actively struggle against biases and potential propaganda?

The traditional way in which intelligence services evaluate information, assessing first the value of the source and then that of the piece of information itself, should be applied.
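To make this rule concrete, here is a minimal sketch in Python, assuming the commonly cited two-dimensional grading (source reliability rated A to F, information credibility rated 1 to 6); the helper name, the combined rating format and the rated examples are illustrative assumptions, not a prescribed implementation.

```python
# Minimal sketch of a two-dimensional evaluation of information:
# source reliability (A-F) and information credibility (1-6).
# The scales follow the commonly cited NATO/Admiralty-style grading;
# the helper and the examples below are illustrative assumptions.

SOURCE_RELIABILITY = {
    "A": "completely reliable",
    "B": "usually reliable",
    "C": "fairly reliable",
    "D": "not usually reliable",
    "E": "unreliable",
    "F": "reliability cannot be judged",
}

INFO_CREDIBILITY = {
    1: "confirmed by other sources",
    2: "probably true",
    3: "possibly true",
    4: "doubtful",
    5: "improbable",
    6: "truth cannot be judged",
}

def evaluate(source_grade: str, info_grade: int) -> str:
    """Return a combined rating such as 'B2' with its plain-language meaning."""
    return (f"{source_grade}{info_grade}: source {SOURCE_RELIABILITY[source_grade]}, "
            f"information {INFO_CREDIBILITY[info_grade]}")

if __name__ == "__main__":
    # A claim relayed by an unknown account, not yet corroborated:
    print(evaluate("F", 3))
    # The same claim once confirmed independently by a usually reliable source:
    print(evaluate("B", 1))
```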

We should absolutely try to use falsification rather than confirmation when mentally testing our hypotheses (see “Useful rules for foresight from Taleb’s Black Swan” for more details). Indeed, one of the strengths of conspiracy theories is that they cannot be falsified: any element or fact becomes “circumstantial evidence”, as very well emphasized by The Interpreter: “The inevitable problem of trying to pin down specific evidence of Russian involvement is that Russia’s highly competent intelligence services are unlikely to leave overt traces. Instead we must work with heavy circumstantial evidence.” In other words, for the author it seems that the less evidence there is, the more certain we can be of Russian involvement, because we are dealing with highly efficient secret services. Here, there is no possible falsification, only confirmation, which takes us from the realm of science and rationality towards faith and beliefs, emotions, as well as collective psychology. This emphasizes again the importance of trust, as underlined by Evans.

Finally, we need to put our hypotheses and understanding through the test of various red teaming methodologies, as explained among others by the excellent Red Team journal and redteams.net, and as we endeavour to do here.

Click on the image below to read the Weekly


Featured image: Teheran US embassy propaganda gun – After the Iranian hostage crisis (1979-1981), the walls of the former US embassy were covered in mostly anti-US murals, by Phillip Maiwald (Nikopol) [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0-2.5-2.0-1.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons.

When Risk Management Meets Strategic Foresight and Warning

Risk management, as codified since 2009 by the International Organization for Standardization (ISO), allows for an almost perfect correspondence with the ideal-type process of strategic foresight and warning (SF&W) as we use it here, even though SF&W was developed mainly out of public service practice and experience, notably intelligence and defense, and with international and national security issues in mind.

The new risk management process thus lays the foundation for easily incorporating geopolitical and other national and international security issues within the risks usually managed by businesses, and should facilitate discussions and exchanges between the corporate world and the public sector, including in terms of data, information, and analysis, according to the specificities and strengths of each.

We shall here detail the risk management process… Continue reading When Risk Management Meets Strategic Foresight and Warning

Intelligence, Strategic Foresight and Warning, Risk Management, Forecasting or Futurism?

Our focus is usually strategic foresight and warning (SF&W) for national security, the latter being understood in terms of traditional and non-traditional security issues, or, to use a military approach, in terms of conventional and unconventional security.[1] Building upon Fingar, Davis, Grabo and Knight, we define it as “an organized and systematic process to reduce uncertainty regarding the future that aims at allowing policy-makers and decision-makers to take decisions with sufficient lead time to see those decisions implemented at best.”

Broadly speaking, it is part of the field of anticipation – or approaches to the future, which also includes other perspectives and practices centered on other themes. Continue reading Intelligence, Strategic Foresight and Warning, Risk Management, Forecasting or Futurism?

Useful Rules for Foresight from Taleb’s The Black Swan

This second post on The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb emphasises some of the author’s points that could be useful to foresight and warning and to all “predictive” work. Many of those themes are not really new, but are already integrated in F&W and, more broadly, in analysis. Nonetheless, it is always useful to underline them, as it is so easy to forget best practice. (The first post can be accessed here.)

Humility

(Notably pp. 190-200) Considering uncertainty, but also our imperfect condition as human beings, the complexity of the social world, feedbacks, and our more than insufficient knowledge and understanding, we must be very humble, accept our partial ignorance, our imperfections and mistakes (and make sure those essentially human flaws are accepted by others, which may be more difficult). Yet, we must also struggle to improve ourselves, to increase our understanding and our capability to foresee the future. Doubt, humility, real dialogue between the different communities which try to understand the world, reflection upon mistakes – to correct what can be identified as wrong or inefficient – and upon successes – to reproduce what worked (according to conditions) – are keys to this improvement.

Taleb’s use of and reference to Montaigne’s wisdom also points to the importance of struggling against the loss of memory – institutional, scientific and general – that plagues us. Some things that were understood in the past are now misconstrued or ignored. It would appear, sometimes, that we are part of a race where youth, novelty, fads and the shortening of time rule as masters. Yet, shouldn’t we pause for a while and wonder about this behaviour and its origin? Should we not question the results stemming from this new race forward? For example, in science (soft and hard), it is not because something was understood, discovered or written decades or centuries ago that it has become wrong. On the contrary, good science starts with knowledge and understanding of past scientific discoveries. Some understandings are outdated, but some are not. Novelty and accuracy of analysis are not synonymous, while discarding all the past only makes us lose time. Consumerism cannot and should not be applied everywhere.

“Black Swan events” (unpredictability, outliers)

As underlined last week, Taleb makes a distinction between “Mandelbrotian Gray Swans” (rare but expected events that are scientifically tractable, pp. 37, 272-273) and real “Black Swan events,” which are never identified in advance. From this we can derive the following “rules”:

  • Making swans gray

Try to imagine as many improbable events as possible, initially suspending disbelief. This is already done; methods, however tentative, exist: wild card scenarios (e.g. James A. Dewar, “The Importance of ‘Wild Card’ Scenarios”); brainstorming; what-if stories and narratives; the use of alternative thinkers and thinking.


The key, here, is imagination and allowing oneself to go beyond groupthink, norms (institutional, social, cultural) and belief-systems, even if ideas may feel dangerous (read for example “In defense of dangerous ideas” by Steven Pinker, Harvard college professor and cognitive scientist, July 2007). Then, as suggested and explained by Dewar, because resources are limited and also because even Swans have to follow a few rules, those potential “gray swans” should be examined in the light of all the other rules. The least likely (or the most absurd) should be discarded. For example, we may always assume that gravity on earth could disappear, or that lambs will become carnivorous; yet the chances are so minimal that we may choose to dispense with these situations. For those events that remain on the gray swans list, potential impacts can be estimated and highly improbable, high-impact scenarios developed (a minimal sketch of such a screening follows the next rule).

  • The absence of certainty

Because we may assume that the likelihood of the existence of Black Swans is very high, we must consider them. This will influence our estimation of probability. We may just forget certainty. This may look like nothing, but I suspect that in the world of security and politics, where the issues of power and control – including in personal terms – are so crucial, truly accepting uncertainty and insecurity is a major effort.
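As announced above, here is a minimal sketch of how these two rules could be operationalised: imagined wild cards are screened against a (subjective) plausibility floor, the least likely are discarded, and the remaining gray swans are ranked by estimated impact. Every scenario, score and threshold below is an invented example, not an assessment.

```python
# Illustrative screening of imagined wild cards into "gray swans":
# keep only candidates above a (subjective) plausibility floor, then rank
# the survivors by estimated impact so that highly improbable, high-impact
# scenarios can be developed further. All entries and thresholds are
# invented examples, not actual assessments.

from dataclasses import dataclass

@dataclass
class WildCard:
    name: str
    plausibility: float  # subjective, 0.0 (absurd) to 1.0 (plausible)
    impact: float        # subjective, 0.0 (negligible) to 1.0 (systemic)

CANDIDATES = [
    WildCard("Lambs become carnivorous", 0.001, 0.2),
    WildCard("Major undersea cable cut isolates a region", 0.15, 0.7),
    WildCard("Sudden collapse of a key transit state", 0.10, 0.9),
    WildCard("Gravity disappears on Earth", 0.0, 1.0),
]

PLAUSIBILITY_FLOOR = 0.01  # below this, the candidate is discarded

def screen(candidates):
    """Discard the most absurd candidates and rank the rest by impact."""
    gray_swans = [c for c in candidates if c.plausibility >= PLAUSIBILITY_FLOOR]
    return sorted(gray_swans, key=lambda c: c.impact, reverse=True)

if __name__ == "__main__":
    for swan in screen(CANDIDATES):
        print(f"{swan.name}: plausibility={swan.plausibility}, impact={swan.impact}")
```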

Continuing the struggle against biases

(pp. 1-164) Cognitive biases being inescapable for human beings, the least we can do is to be aware of them and to persist in our struggle against them. Using our increasing knowledge and awareness of cognition, we may continue applying and creating specific training, and systematically incorporate related safeguards in methodologies and processes, from organization (for example, people joining the exercise at different stages) to team composition (people with different backgrounds, psychological makeup, etc.).

Meanwhile, introspection and reflection should be promoted by and for those who deal with foresight, forecast and prediction, as exemplified (here on the question of ethics and potential biases induced by “conflicts of interest”) by this very recent post by Jay Ulfelder. “Introspective phases” could and should be included at different stages of the foresight process.

Those phases should notably fight against the known phenomena of anachronistic projections and cultural projections. Anachronistic projections are usually made with regard to past history (judging past actions from the point of view of today’s moral norms; understanding the past through today’s lenses), which obscures understanding. For example, we currently struggle against drug trafficking, notably because it endangers our societies and is seen as morally bad. Yet opium was an accepted state activity at least from the 19th century well into the 20th century. This does not mean going backwards in terms of norms, or accepting things that are seen as morally reprehensible or are damaging or hurtful, but it would help locate phenomena in their proper context, and thus focus on dynamics, processes and understanding. This would also (ideally) help us move from an attitude that favours judging and casting blame, with all the power struggles and violence that this implies, towards a much more constructive behaviour, promoting understanding, prevention and healing.

A political scientist* somewhere gives a great trick that we could usefully apply: if you read (we can extend it to think or say) somewhere the word “always,” then stop and think.

Not being prey to anachronistic projections would also imply considering the evolution of ideas and norms, and setting time-dependent struggles within historical processes. Coming back to foresight, anachronistic projections may just as well be made regarding the future. What does this imply? Can we devise methods to try to minimize them? How can we best proceed to include ideas, norms, and beliefs in our models?

Cultural projections are even better known and may be easier to consider. Not falling prey to them will demand knowing our own cultural sets of norms on top of, or even prior to, those of others (e.g., in the anticipation field, Werther, 2008). Just asking ourselves this question during foresight exercises could improve results. Similarly we must struggle against being victims of ideology. This does not mean rejecting this or that belief, just being aware of what influences us.

Finally, the impact of emotions on our cognition, emotional biases, should be considered, as human beings are definitely not rational, emotionless beings.

Falsification rather than confirmation

(Notably pp. 55-61) The risks of induction, which are so important to us because so many of our analyses come from collected evidence, are linked to our tendency to seek confirmation rather than falsification (looking for an element, a fact, an event that would prove our hypothesis or explanation wrong). All our analyses – and this is valid for all our explanations and understanding, not only for foresight and warning – should include an effort at falsification, without however denying confirming facts. We should always wonder: which evidence or fact should I look for to disprove my theory, analysis, estimation or conjecture?

Furthermore, this effort should be mentioned in the final anticipatory product (potentially in an appendix, according to the specificities demanded by the delivery needs of the customer) to allow for follow-up and update. Meanwhile, indicators specifically designed for falsification should be created. For example, in foresight, if we have concurrent scenarios, the occurrence of an event or any indication showing that one scenario is becoming less likely should be considered, and the set of scenarios should be revised accordingly.
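As a minimal sketch of what such falsification-oriented indicators could look like in practice, the illustration below downgrades and re-ranks a set of concurrent scenarios whenever a disconfirming indicator is observed. Scenario names, indicators, weights and the downgrade rule are all hypothetical.

```python
# Minimal sketch of falsification-oriented indicators attached to scenarios.
# Each scenario lists observations that would count AGAINST it; when such an
# indicator is observed, the scenario's plausibility is downgraded and the
# whole set is re-ranked. Scenarios, indicators and numbers are hypothetical.

scenarios = {
    "Scenario A: negotiated settlement": {
        "plausibility": 0.5,
        "disconfirming_indicators": {"talks collapse", "major offensive launched"},
    },
    "Scenario B: prolonged stalemate": {
        "plausibility": 0.3,
        "disconfirming_indicators": {"decisive breakthrough on either side"},
    },
    "Scenario C: rapid escalation": {
        "plausibility": 0.2,
        "disconfirming_indicators": {"mutual force withdrawal"},
    },
}

DOWNGRADE_FACTOR = 0.5  # arbitrary penalty applied when a scenario is disconfirmed

def update(observed_event: str) -> None:
    """Downgrade every scenario for which the observed event is disconfirming."""
    for data in scenarios.values():
        if observed_event in data["disconfirming_indicators"]:
            data["plausibility"] *= DOWNGRADE_FACTOR
    # Re-normalise so the concurrent scenarios can still be compared.
    total = sum(d["plausibility"] for d in scenarios.values())
    for data in scenarios.values():
        data["plausibility"] /= total

if __name__ == "__main__":
    update("talks collapse")
    ranked = sorted(scenarios.items(), key=lambda kv: -kv[1]["plausibility"])
    for name, data in ranked:
        print(f"{name}: {data['plausibility']:.2f}")
```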

Careful causality: “silent evidence”


(pp. 100-121) Taleb starts by cautioning against the dangers of applying causality when there is none, or when “silent evidence” could potentially distort causal reasoning. “Silent evidence” is what we don’t know, cannot know, cannot hear, do not hear. To explain “silent evidence”, Taleb uses Cicero’s story: “Diagoras, a nonbeliever in the Gods, was shown painted tablets bearing the portraits of some worshippers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning. Diagoras asked, ‘Where were the pictures of those who prayed, then drowned?’” Taleb, however, does not reject causality but encourages us to use it with care and caution, thinking about the possible existence of “silent evidence”.
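Cicero’s story can be turned into a small, entirely artificial simulation of silent evidence: survival is generated independently of prayer, yet the visible record, the tablets of worshippers who survived, makes prayer look perfectly effective because the drowned leave no tablets. All numbers are invented for illustration.

```python
# Toy simulation of "silent evidence" (survivorship bias) in Diagoras' story.
# Survival is drawn independently of prayer, yet the visible record (tablets
# painted by worshippers who survived) suggests prayer always works, because
# the drowned worshippers leave no tablets. All numbers are invented.

import random

random.seed(42)
N = 100_000
SURVIVAL_RATE = 0.3   # same for everyone, by construction
PRAYER_RATE = 0.5

sailors = [
    {"prayed": random.random() < PRAYER_RATE,
     "survived": random.random() < SURVIVAL_RATE}
    for _ in range(N)
]

# Full (unobservable) record: survival rate is the same whether or not one prayed.
for prayed in (True, False):
    group = [s for s in sailors if s["prayed"] == prayed]
    rate = sum(s["survived"] for s in group) / len(group)
    print(f"prayed={prayed}: true survival rate = {rate:.2f}")

# Visible record: only the tablets of worshippers who prayed AND survived remain.
tablets = [s for s in sailors if s["prayed"] and s["survived"]]
print(f"Among the {len(tablets)} visible tablets, everyone prayed and everyone survived.")
```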

Creation of new adapted quantitative tools

(The whole book) Except in the cases where they genuinely apply, the author warns us to distrust correlations (which, furthermore, do not allow for understanding, as the famous “correlation does not imply causation“ reminds us), linear trends and Gaussian distributions. Instead, research involving fractals and scalability, and complexity theory (as done for example by the New England Complex Systems Institute), should be favoured. Building upon this, we can imagine creating new tools allowing for multi-disciplinary research, articulating, when necessary (there is no need to use something very complicated if a simple analysis or logic is sufficient), complex modeling, agent-based models, and a mix of quantitative and qualitative assessments (notably including processes and feedbacks), and integrating them within foresight methodologies. What should guide us is always the issue or problem at hand.
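As a minimal numerical illustration of the Gaussian versus scalable contrast, the sketch below compares the share of the total accounted for by the single largest observation in a thin-tailed (Gaussian) sample and in a heavy-tailed (Pareto) sample; the distributions and parameters are chosen purely for illustration, not drawn from the book.

```python
# Minimal illustration of the Gaussian vs. scalable (heavy-tailed) contrast:
# in a Gaussian sample the largest observation is a negligible share of the
# total; in a heavy-tailed (Pareto) sample a single observation can dominate.
# Distribution choices and parameters are illustrative only.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

gaussian = np.abs(rng.normal(loc=100.0, scale=15.0, size=n))  # thin-tailed
pareto = (rng.pareto(a=1.1, size=n) + 1.0) * 100.0            # fat-tailed

for name, sample in (("Gaussian", gaussian), ("Pareto", pareto)):
    share_of_max = sample.max() / sample.sum()
    print(f"{name}: largest single observation = {share_of_max:.2%} of the total")
```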

* Unfortunately I cannot remember who wrote this, nor in which book or article. Initially I thought it was Benedict Anderson in Imagined Communities but after having skimmed again through the book, I cannot locate the citation, thus it could be Anthony Smith, or Eric Hobsbawm, or… if anyone knows, I would welcome the reference!

————-

References

Dewar, James A., “The Importance of ‘Wild Card’ Scenarios,” Discussion Paper, RAND.

Pinker, Steven, “In defense of dangerous ideas”, July 2007.

Ulfelder, Jay, “Advocascience“, 27 January 2013, Dart-Throwing Chimp.

Werther, Guntram F. A., Holistic Integrative Analysis of International Change: A Commentary on Teaching Emergent Futures, The Proteus Monograph Series, Volume 1, Issue 3, January 2008.