Before the COVID-19 crisis erupted, several signals warning of a possible pandemic had been emitted. Scientific studies, some published in prestigious journals, reported the risk of zoonotic disease outbreaks and propagation, notably linked to SARS-type viruses, as well as their potential origins. Examples include articles published in Nature in 2008, in Nature Medicine in 2015, and in Proceedings of the National Academy of Sciences in 2016.
As also pointed out in several recent articles, back in 2015 Microsoft founder Bill Gates, whose foundation is currently backing the development of a COVID-19 vaccine, had warned about the risk of a coming pandemic and the need to prepare for it. Likewise, a National Intelligence Council report from 2017 entitled "Global Trends: Paradox of Progress" put forward the scenario of a pandemic in 2023.
How, then, can we explain that such clear information did not lead to better anticipation of the current crisis? Several barriers help account for this.
Barriers to weak signals
In the 1970s, the Russian-American professor and corporate strategy consultant Igor Ansoff suggested that weak signals had to pass through three filters – information (surveillance/observation), mentality (interpretation) and power (decision) – before potentially inducing action.
The information filter corresponds to the ability of the weak signal to be detected or discovered by one or several actors within the organisation, amid all the other information perceived.
The mentality filter refers to the ability of the signal to be recognised as information that is relevant to the current situation once it has been detected. A number of cognitive biases can explain why this information is dismissed or overlooked (normalcy bias, confirmation bias, optimism bias, etc.).
These biases are individual, but other, organisational factors can also help explain why such signals are ignored: the groupthink phenomenon, for example, or the organisation's values and culture.
Finally, the power filter concerns decision-making once the signal has been detected and its relevance recognised. Those with decision-making power within the organisation can choose not to make the signal a priority despite the underlying risk.
More recently, a fourth filter has been discussed: transmission. It refers to the flow of information within the organisation and it is situated between the mentality and power filters. Indeed, the people who detect the signal and are the first to form an opinion on what it means are not generally those with the power to decide on whether or not to act.
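The four-filter model described above can be read as a simple sequential pipeline: a weak signal triggers action only if it clears every filter in order. The sketch below is purely illustrative and not from the article; the class, field names and the example signal are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    """A weak signal passing through Ansoff's filters (hypothetical model)."""
    description: str
    detected: bool = False      # information filter: noticed amid other data?
    relevant: bool = False      # mentality filter: judged meaningful, not dismissed?
    transmitted: bool = False   # transmission filter: did it reach decision-makers?
    prioritised: bool = False   # power filter: did leadership choose to act?

def passes_filters(s: Signal) -> bool:
    # A single failed filter is enough for the signal to be lost.
    return s.detected and s.relevant and s.transmitted and s.prioritised

pandemic_warning = Signal(
    "2015 study on SARS-like zoonotic spillover risk",
    detected=True, relevant=True, transmitted=False,  # stalled at transmission
)
print(passes_filters(pandemic_warning))  # False: never reaches decision-makers
```

The sequential structure makes the article's point concrete: each filter is a veto point, so the probability of action shrinks with every stage a signal must cross.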
In all likelihood, the different signals relating to what, at that time, was still a possible pandemic caused by a SARS-type virus did not manage to make it through these filters.
Better integration of weak signals
What strategies could possibly lead to a better identification and integration of weak signals in risk governance?
Several avenues from the field of strategic management could be helpful. First of all, it seems necessary for both decision-making and non-decision-making actors to acknowledge the existence of the organisational frame of reference: the filters applied to weak signals often emerge from this very frame of reference.
Practising strategic dialogue and involving multiple internal and external actors with different frames of reference could help to reveal and challenge this dominant model and its basic assumptions.
This practice is used by the Chemistry Industry Association of Canada (CIAC), the organisation behind the Responsible Care initiative created in 1985 and now recognised by the United Nations as a standard for the safe, responsible and sustainable management of chemical products.
Twice a year, a National Advisory Panel composed of 13 members, including academics, environmental leaders and community members, meets to alert the Association to emerging issues and to question and rethink its risk management practices.
This example also illustrates the importance of a greater integration of networks, something we are currently investigating. By involving the other entities to which the organisation is connected, directly or indirectly (academia, businesses, government departments, communities, citizen groups, etc.), it is possible to monitor developments outside the organisational frame of reference and to improve the sharing of information. This strategy necessarily implies trusting these networks; without that trust, it remains ineffective.
Another strategy is scenario planning. This collaborative approach based on narratives makes it possible to explore risks and consider plausible rather than probable futures. This practice, which challenges what is usually assumed, is especially helpful for integrating uncertainties and ambiguities and for making sense of complex situations.
In the specific context of crisis management, and in the face of the uncertainty associated with climate change, scenario planning is increasingly being used to guide the development of public policies pertaining to the conservation or restoration of natural ecosystems. Participatory approaches in particular, which require actively involving local populations in the process, lead to a better integration of local or traditional knowledge.
Through these practices, particularly scenario planning and dialogue, it is possible to develop ethical thinking to support decision-making: thinking that takes into account the underlying values that guide us when deciding whether a signal is a priority or whether a risk is acceptable.
This article was originally published on The Conversation: https://theconversation.com/anticiper-les-crises-ces-filtres-qui-nuisent-a-lanalyse-des-signaux-faibles-137171