Fernando Barrio
Fernando Barrio is co-director of the Centre for Environmental Change and Communities and principal lecturer in business and law at Queen Mary University of London. This essay is his written response in January 2026 to the question, “How might individuals and societies embrace, resist and/or struggle with transformative change in the AI Age? What cognitive, emotional, social and ethical capacities must we cultivate to ensure effective resilience? What actions must we take right now to reinforce human and systems resilience? What new vulnerabilities might arise and what new coping strategies are important to teach and nurture?” It was published, alongside more than 200 other essay responses, in the 2026 research study “Building a Human Resilience Infrastructure for the AI Age.”

“Artificial intelligence is already becoming more consequential and less visible than it was just a year ago, serving as the infrastructure through which institutions perceive reality and act upon it. AI is embedding itself in systems that allocate resources, assess risk, filter knowledge and coordinate action, and as it does so, it increasingly disappears from view, not because it is insignificant, but because it has become part of the environment itself. Yet this environment will not be experienced in the same way everywhere. In the Global North, AI will often arrive as convenience, optimisation and support; in much of the Global South, it will arrive as condition, requirement and constraint, shaping access to services, work and mobility long before meaningful public debate takes place. For much of human history, resilience was understood as a personal capacity: the ability to endure uncertainty and recover from disruption. Yet AI does not simply introduce disruption; it reorganises it, moving uncertainty from visible human disagreement into opaque technical systems where power is exercised indirectly and responsibility is diffused, and this shift is not neutral. Societies with strong institutions, regulatory capacity and social protections will be able to contest and shape AI systems, while those without them will experience automation as imposed, imported and difficult to refuse.

“In this environment, resilience can no longer be defined only as emotional strength or cognitive flexibility, because the challenge is no longer simply how to cope with change, but how to retain agency when the systems producing change are designed elsewhere. Resilience must therefore become institutional, legal and collective, or it will remain fragile and deeply unequal.

“People will both embrace and resist this transformation, often at the same time.

“They will embrace AI because it reduces friction in daily life, because it writes and summarises, plans and predicts, advises and coordinates, and because it fills gaps left by under-resourced institutions. In many parts of the world, AI will be adopted not because it is trusted, but because it is the only scalable option available. Yet people will also struggle, because these same systems quietly narrow the space for discretion, replacing judgment with defaults and deliberation with optimisation, so that life becomes easier to navigate but harder to contest. What is gained in efficiency may be lost in sovereignty, especially where systems are procured rather than co-designed. This tension will define the coming decade.

“Many people will adapt pragmatically, learning how to prompt systems, how to phrase appeals, how to align their behaviour with algorithmic expectations and how to live within infrastructures they do not fully understand. But this adaptation will look very different across regions. In wealthier societies, it may be framed as innovation; elsewhere, as survival. Yet in both cases, adaptation will often be closer to coping than to resilience, because resilience requires the ability to step outside a system, to question its premises and to refuse its outcomes when they are unjust. Without that capacity, adaptation becomes dependency and dependency becomes normality, particularly where alternatives do not exist.

“The capacities we must cultivate are therefore not only technical but civic. The practices that enable resilience are not technical add-ons but political commitments in support of human flourishing.

“Cognitive resilience in an AI-saturated world does not mean learning how to use tools more efficiently, but understanding that AI outputs are probabilistic, contextual and shaped by embedded assumptions about value, risk and efficiency, assumptions that often reflect the priorities of those who build the systems rather than those who live under them. Education must therefore teach people not only how to work with AI, but how to interrogate it, how to localise it and how to challenge it, especially in contexts where AI is imported as infrastructure rather than developed as a public good. These are democratic skills and they are essential for technological self-determination.

“Emotional resilience will also be tested, as AI accelerates change and destabilises long-standing ideas about expertise, creativity and work. In many economies, automation will intersect with informality, precarity and weak social protection, intensifying insecurity rather than alleviating it. Resilience here cannot be reduced to individual coping strategies or digital skills training; it requires social protection, labour transitions and public narratives of value that extend beyond productivity, because without these, AI will amplify existing vulnerabilities rather than mitigate them.

“Social resilience will depend on whether AI is used to strengthen cooperation or to replace it. In regions where public institutions are fragile, people will increasingly turn to AI for guidance, support and sensemaking not because they prefer to, but because no human alternative is available. This may help individuals cope, but it risks deepening isolation and eroding trust if digital systems substitute for relationships rather than supporting them. Strong human institutions remain the foundation of resilience, even in highly digital societies and especially in those where technology arrives faster than governance.

“Ethical resilience may be the most fragile of all, because AI systems reward speed, efficiency and compliance, while ethical action often requires hesitation, questioning and refusal. In asymmetric contexts where power is concentrated and accountability is weak, challenging automated decisions can carry real risk. Ethical resilience therefore cannot depend on individual courage alone; it must be protected through law, collective action and international norms that recognise the unequal distribution of technological power and the right of societies to refuse harmful automation.

“Transparency must be a right rather than a feature and it must apply across borders. Contestability must be normal rather than exceptional and accessible even to those without technical expertise. Liability must be traceable rather than dissolved into global supply chains. Public institutions must have the capacity to audit, regulate and redesign digital infrastructure in the public interest and international cooperation must support that capacity rather than undermine it. Without these conditions, resilience will become a luxury, unevenly distributed along existing lines of wealth and power.

“If societies fail to act now, new vulnerabilities will harden quickly. Inequality will deepen as resilience becomes a privilege of those with education, connectivity and institutional voice. Cognitive dependency will grow as judgment is delegated by default to systems designed elsewhere. Democratic erosion will accelerate as automated systems quietly replace deliberation in domains that were once governed by politics. The most dangerous vulnerability, however, is normalisation: the moment when societies accept that they have no choice, that systems cannot be questioned and that the future is something imported rather than shaped.

“Resilience in the age of AI is therefore not about becoming more adaptable individuals, but about becoming more demanding societies, capable of insisting that systems remain intelligible, contestable and aligned with local and global values. The future will not be decided by how intelligent our machines become, but by how seriously we take the task of governing them, teaching with them, and, when necessary, refusing them. If AI is treated as destiny, resilience will shrink. If it is treated as infrastructure, subject to democratic design, shared responsibility and global justice, resilience may yet expand, quietly and deliberately, into a form worthy of a world that is no longer evenly connected, but still collectively responsible.”

