Cognitive bias in crisis decision-making
A full UK public inquiry into the coronavirus pandemic has been deferred until 2022. The outcome of this inquiry will include a list of lessons to be learned in preparedness for the next pandemic. But COVID-19 continues to pose a significant UK public health risk, requiring further difficult control decisions. Accordingly, on 12 October 2021, a House of Commons report on coronavirus lessons learned to date was issued. With groupthink and other cognitive biases highlighted in that report, an important lesson emerges about the role of cognitive bias in making crisis decisions.
Beliefs and bias
During the pandemic, scientists, physicians, and politicians formed beliefs about COVID-19 without formally acknowledging that these beliefs might have been prone to aspects of cognitive bias. The Economics Nobel Laureate, Daniel Kahneman, has noted that the confidence people have in their beliefs is not a measure of the quality of evidence, but of the coherence of the story the mind has managed to construct. Accordingly, any major review of key decisions made during the pandemic can provide insight into the intrusion of cognitive bias. One such review is the House of Commons coronavirus report on lessons learned to date.
The House of Commons Report
The House of Commons inquiry was established in October 2020 with the aim of providing a fuller evaluation of the Government’s handling of the pandemic. The purpose was not to apportion blame, but to seek to provide an early assessment of the key decisions, structures and underlying factors which contributed to the extent of the pandemic’s impact in the UK.
The resulting report from the Health and Social Care and Science and Technology Committees was made publicly available on 12 October 2021.
Examples of cognitive bias
On the publication date, the current Chief Scientific Adviser, Sir Patrick Vallance, clarified for the first time during the pandemic that ‘science informs, but does not lead the way in public policy’. In a radio interview with the popular science presenter, Prof. Jim Al-Khalili, he stated that science is ‘actually about uncertainty: uncertainty is part of the progress in science’. This is especially true of pandemic science. A century after the 1918 pandemic, the geographical source is still uncertain – as is the source of SARS-CoV-2.
The House of Commons report reveals that Ministers and other advisers found it difficult to challenge the views of their official scientific advisers. The intrinsic uncertainty in scientific knowledge might have been more candidly admitted, along with the lack of scientific training within Whitehall. The Chief Scientific Adviser was dismayed that only 10% of the Civil Service Fast Stream had a science or engineering degree. Such graduates would have been aware of probability, the formal language of uncertainty. In Singapore, not only does the Prime Minister’s Office recruit highly numerate staff, but Prime Minister Lee himself was a top Cambridge mathematics graduate.
Where scientific evidence is unclear and uncertain, early decisions can still be made by political leaders on a precautionary basis, as has happened successfully in a number of East Asian countries, including Singapore. The House of Commons report praised East Asian countries for the rigorous approach adopted to stopping the spread of the coronavirus.
At many a Downing Street press briefing, the mantra of ‘following the science’ has been a standard defence of action – or inaction. In a crisis, decision-makers should follow sound principles of risk management. Waiting for a high degree of scientific certainty is not appropriate in urgent crisis decision-making. Prof. Neil Ferguson told the Science and Technology Committee in June 2020 that if the national lockdown had been instituted even a week earlier, the death toll would have been reduced by at least a half. This counterfactual does not require elaborate mathematical modelling, which is criticized in the House of Commons report; it is a simple consequence of the rapid exponential spread of a novel infection.
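The arithmetic behind this counterfactual can be sketched directly. Assuming, purely for illustration, an early-epidemic doubling time of around 3.5 days (this figure is an assumption for the sketch, not taken from the report), a one-week head start shrinks the epidemic's size at the moment of lockdown roughly four-fold:

```python
# Illustrative sketch of the exponential-delay counterfactual.
# The doubling time is an assumption for illustration, not an
# estimate from the House of Commons report.

def epidemic_size_ratio(delay_days: float, doubling_time_days: float) -> float:
    """Factor by which prevalence grows over `delay_days` of unchecked
    exponential spread with the stated doubling time."""
    return 2 ** (delay_days / doubling_time_days)

# With an assumed doubling time of 3.5 days, a one-week delay
# lets prevalence grow four-fold before controls begin:
ratio = epidemic_size_ratio(delay_days=7, doubling_time_days=3.5)
print(ratio)  # 4.0
```

Whatever the precise doubling time, the point stands: under exponential growth, each week of delay multiplies the eventual toll, which is why the halving claimed by Prof. Ferguson needs no elaborate modelling.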
Contemplating precautionary action ahead of the gathering of observed data does not come easily to those scientifically trained. Prof. Chris Whitty, the Chief Medical Officer for England, told the Science and Technology Committee in November 2020 that he preferred advice to be given on the basis of observed data. This is a perfectly sound and justifiable preference, yet clearly inappropriate in an emergency without time for ideal data gathering. The opposite approach was seen in Singapore: rather than being discouraged, face masks became mandatory in April 2020, as a common-sense courtesy to others, well before the accrual of clear, unequivocal scientific evidence of their value.
Another subject on which scientific evidence took months to accrue was asymptomatic transmission. A SAGE meeting on 20 January 2020 noted limited evidence of asymptomatic transmission. An early case study from Wuhan, China, circulated as a scientific preprint. This partial evidence had important implications for transmission from those untested. The Secretary of State for Health has since expressed regret at not insisting that the UK proceed on the basis that asymptomatic transmission was occurring until it was known not to be, rather than the other way around.
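This reversed burden of proof is, in effect, an expected-loss argument familiar from risk management. A minimal sketch, with all costs and probabilities purely hypothetical, shows why precaution can win even when the probability of asymptomatic transmission is judged modest:

```python
# Hedged illustration of the precautionary expected-loss argument.
# Every number here is a hypothetical chosen for illustration; none
# comes from the report or from pandemic data.

def expected_losses(p_transmission: float, cost_of_precaution: float,
                    cost_of_unchecked_spread: float) -> tuple[float, float]:
    """Expected loss of acting now versus waiting for certainty.

    Acting incurs the precaution cost regardless of the truth.
    Waiting incurs the spread cost only if transmission is real.
    """
    loss_if_act = cost_of_precaution
    loss_if_wait = p_transmission * cost_of_unchecked_spread
    return loss_if_act, loss_if_wait

# Even at a modest 30% probability, precaution minimises expected loss
# whenever unchecked spread is an order of magnitude costlier:
act, wait = expected_losses(p_transmission=0.3,
                            cost_of_precaution=10,
                            cost_of_unchecked_spread=100)
print(act < wait)  # True
```

The asymmetry of the losses, not the balance of the evidence, is what drives the decision, which is the essence of proceeding "until we knew that there was not".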
In pandemic planning, the former Chief Medical Officer for England, Dame Sally Davies, told the House of Commons inquiry that they were in ‘groupthink’. Infectious disease experts did not believe that SARS, or another SARS, would get from Asia to the UK. She expressed the need to open up and get more challenge into thinking about what we are planning for. She added that it would be worth bringing in people from Asia and Africa to broaden our pool of experience. The House of Commons report echoed this in concluding that the composition of SAGE suffered from a lack of representation from outside the UK.
In all risk assessment, cognitive bias is a potentially dangerous pitfall. Other than groupthink, experts are prone to availability bias (reliance on what immediately comes to mind, e.g. SARS), and optimism bias (expecting a positive outcome, e.g. limited transmission). Cognitive bias is not restricted to any profession, and is prevalent amongst finance professionals.
Cognitive bias has become a topic of increasing actuarial focus, and a general insurance actuarial working group on this, led by the author, was established in 2020 as one of the IFoA’s COVID-19 action taskforce workstreams. In the context of the pandemic, those who are vaccine-hesitant may be prone to confirmation bias when hearing of any adverse reaction to vaccination.
A decade ago, I served on the Blackett Committee chaired by the Chief Scientific Adviser, Sir John Beddington, to address extreme UK risks (a broad range of events, not just pandemics). During his term of office, from 2008 to 2013, Sir John Beddington had to deal with a number of unprecedented disasters. These included the eruption of the Icelandic volcano Eyjafjallajökull in 2010, which shut down European airspace, and the great Japanese earthquake and tsunami of 2011, which caused serious damage to the Fukushima nuclear power plant.
In advising the Prime Minister on the re-opening of UK airspace in 2010, or the evacuation of UK citizens from Tokyo in 2011, the Chief Scientific Adviser might well have written:
In the early days of a crisis, scientific advice may be necessarily uncertain: data may be unavailable, knowledge limited and time may be required for analysis to be conducted. In these circumstances, it may be appropriate to act quickly, on a precautionary basis, rather than wait for more scientific certainty.
In fact, this was the first of the recommendations and lessons learned in the House of Commons report.
There are numerous lessons to be learned from the pandemic about virology, epidemiology and public health care. Yet, one of the most important lessons to be learned from the pandemic is recognition of cognitive bias in making crisis decisions under uncertainty.