The Bank Run That Wasn’t: How epistemic failures can lead to national crises, and what we can do about it

Breaking: Queues are forming outside cash machines across the UK this morning as banking apps slow and fail amid mounting reports that major financial institutions are on the brink of collapse. Several banks have been forced to restrict transfers, and we understand the Treasury is currently engaged in emergency talks as UK markets crash. It is unclear what sparked this panic, but BBC investigators warn that false information and deepfake reporting are surging online. Please seek further information responsibly.

This scenario is not real. Neither is it fiction. It is one of several hypothetical futures explored in depth in Epistemic Security for Crisis Resilience, a new report from Demos, developed in partnership with the Centre for Emerging Technology and Security (CETaS) at the Alan Turing Institute.

Hypothetical Crisis Scenarios:

  • Xenophobic violence leads to breakdown in trust
    A far-right chemical attack followed by deepfakes and coordinated disinformation falsely blames a minority community, triggering violence and a collapse in trust in public authorities.
  • AI-driven breakdown of the legal system
    Pervasive generative AI and digital vulnerabilities corrupt legal records and evidence at scale, collapsing accountability, destroying trust in justice, and enabling crimes to go unpunished.
  • Bank run and economic collapse
    State-backed disinformation, deepfakes and small-scale financial sabotage erode confidence in banks, triggering mass panic and systemic economic collapse within 72 hours.
  • Foreign tech superpower undermines UK sovereignty
    Control over global digital infrastructure is weaponised by a foreign power, crippling UK businesses and essential services and turning technology dependence into geopolitical leverage.


These scenarios were designed by experts to examine how threats to the UK’s epistemic systems — the processes by which information is produced, disseminated, consumed, and trusted — increase the UK’s vulnerability to crises, both by raising the risk of crises developing and by hindering society’s ability to respond.

While the scenarios are hypothetical, they are plausible futures built from real conditions and influencing factors: technological developments, institutional dynamics, social and economic conditions, the legal and regulatory context, and geopolitical events.

Earlier work by the authors, using similar methods in 2018, proved highly predictive: it anticipated both the Southport Riots and the crisis of communication and public distrust surrounding the COVID-19 pandemic response.

This report is a second iteration, updated for the current context and looking towards interventions.

Our new research shows that the UK’s epistemic environment is highly vulnerable: exposed to a dense web of interconnected risks spanning mass digitisation, platform dependence, foreign information operations, cyber insecurity, the growing scale and sophistication of AI‑enabled manipulation, and disintegrating news ecosystems (Figure below). Taken together, these pressures create a fragile information ecosystem in which crises are triggered more easily and escalate more quickly, and in which crisis response is increasingly difficult to coordinate, thwarted by intensifying division and distrust.

Crisis scenario map: Bank run & economic crash 

Bolstering the UK’s epistemic resilience must be treated as a serious matter of national security. But there is no silver bullet. Focusing narrowly on individual symptoms like pulling government comms off X (however satisfying that feels) must not distract from or delay efforts to do the hard work of tackling the more complex and systemic vulnerabilities in our epistemic systems. In an environment suffering ‘death by a thousand cuts’, resilience will require sustained, coordinated action across multiple fronts.

Epistemic security is a hydra-headed tangle of distributed and complex challenges, but it is not intractable.

During the workshops, experts analysed four distinct crisis scenarios featuring different actors, influencing factors, crisis types and mechanisms. Despite these differences, intervention analysis identified seven cross-cutting intervention areas common to all four scenarios.

We identify these as the most important leverage points for action; efforts in these areas will go furthest toward improving societal resilience to crisis and complex challenges across the board.  

Cross-cutting intervention areas (in no particular order):

  1. Content and data provenance: including digital signatures and watermarking, to help establish where information comes from and whether it has been manipulated.
  2. Media and information literacy: ensuring citizens are better equipped to navigate complex and contested information environments.
  3. Cybersecurity and digital infrastructure: recognising that epistemic resilience depends on the robustness of the systems that carry information.
  4. Crisis protocols for online platforms: so that companies have clear expectations and responsibilities when information threats escalate rapidly.
  5. Government and regulatory preparedness: updating civil contingency planning to reflect epistemic risks rather than purely physical ones.
  6. AI risk management: addressing how advanced AI systems can distort knowledge production at scale if left unchecked.
  7. Revitalising local and regional news ecosystems: which play a critical role in anchoring trust and shared understanding during crises.

Together, these seven areas form a practical framework – a roadmap for policymakers on where to allocate limited resources in order to strengthen the UK’s epistemic foundations. They also shift the focus from reactive firefighting to structural resilience.

The opening bank run story isn’t real (yet). But the vulnerabilities it and the other scenarios expose are very much with us. We must recognise our society’s epistemic systems as critical infrastructure and act early to strengthen them.