Alison Poltock
Alison Poltock is co-founder of AI Commons UK and The Heart of AI community interest groups and author of a Substack titled “The Future is Personal.” This essay is her written response in January 2026 to the question, “How might individuals and societies embrace, resist and/or struggle with transformative change in the AI Age? What cognitive, emotional, social and ethical capacities must we cultivate to ensure effective resilience? What actions must we take right now to reinforce human and systems resilience? What new vulnerabilities might arise and what new coping strategies are important to teach and nurture?” It was published in the 2026 research study “Building a Human Resilience Infrastructure for the AI Age.”

“Resilience in the age of AI will not come from technical mastery but from clarity. Clarity in our ability to stay human under systemic pressure. Clarity about the boundaries between self, systems and automation. Clarity about where human responsibility begins and where machine logic must end. As artificial intelligence becomes embedded in our everyday lives, public systems and personal decision-making, the question is no longer whether AI will change society, but how quickly and with what oversight. Most public discourse remains preoccupied with the price of AI development (in dollars and environmental terms), job loss, bias and the erosion of privacy. All serious concerns. But the deeper structural risk is the erosion of the communal coordinates that anchor our shared truths and shape the conditions under which human identity, judgment and meaning are formed.

“We now allow AI-driven systems to ‘optimise’ our words, our work, our sleep, our moods. Students can automate away the struggle to find their unique voice. Policymakers can lean on predictive tools without understanding the assumptions beneath. A friend can engineer the perfect condolence without the need for any inconvenient feelings.

“AI is helping us cut corners. It is saving us a lot of time. But as we outsource our memory, language and creativity, we risk outsourcing our core human instincts as well. This loss isn’t registered at the level of headlines; it accumulates through habit. Over time, our muscle of introspection weakens, moral reasoning thins and the space for ambiguity and uncertainty – the playground of human insight – shrinks. It’s a quiet exit. There are no alarm bells. No spectacle. Just a lot less skin in the game.

“We are in a moment of epistemic shift. Surveys in the past few years indicate that many people may be spending more time using AI-based platforms to be informed, discuss issues and share their lives than participating in real-world, face-to-face social interaction. These are not marginal trends. They reveal that the developmental frameworks shaping identity, agency and social orientation are shifting. This is the terrain of vulnerability. Yet there is no shared conversation. No civic space where this new reality is named, let alone addressed. We are operating on outdated institutional architecture, strapping jetpacks to systems built for another age and allowing our children to grow up in the gap.

“AI systems are not just tools. They are parasitic by design. To reflect our voices, values, needs, they must be trained on our data, our habits, words and fears. This isn’t a side effect; it’s the core architecture. If we want AI to be of use to us, the system must first extract from us. Resilience begins by recognising that trade-off and deciding what must not be given away. What AI returns is not neutral. In maximising engagement, it slices up the digital world into private, personalised feeds. We lose the shared reference points that allow us to think, argue and act together. The digital Commons is not just shrinking, it’s being atomised. AI thrives on fragmentation. Democracy does not.

“Resilience, then, cannot be reduced to personal ‘grit’ or mindfulness. It must be treated as a civic design imperative and built into the systems and cultures that shape public life. That means:

1) “Structural Boundaries: Some decisions must remain human by design. Life, death, identity, rights and justice are not engineering problems. Governance must begin with red lines, backed by law, that guarantee human judgment in critical domains.

2) “Institutional Accountability: Any AI used in public life must be intelligible and open to scrutiny. Its function, data and outcomes must be visible to those it affects, with clear mechanisms for challenge and redress. A society cannot remain democratic if its citizens cannot audit the systems influencing them.

3) “Public Naming: We cannot govern what we cannot describe. Today’s AI terminology is fragmented – drawn from neuroscience, engineering, psychology and myth. But how we name systems shapes how we relate to them. AI systems must have an understandable, shared civic vocabulary or collective governance fails.

4) “AI Literacy: Using AI isn’t enough. Citizens must understand how systems are built, what trains them and where they fail. We need tools to interrogate outputs, decode assumptions and challenge influence. Interpretive literacy must be a civic right, requirement and governance priority.

5) “Cultural Safeguards: Resilience requires full human presence – not just ‘human in the loop,’ but human at the centre. Care, teaching, listening and community work are civic infrastructure. These roles carry our values and must be funded, protected and prioritised.

6) “Human-Centered Measurement: Public systems must resist valuing only what machines do well – speed, scale, efficiency. If those are our benchmarks, people will always fall short. We need metrics that honour trust, care, judgment, attention and social contribution. What we choose to measure defines what we choose to protect.

7) “Rights of Inclusion: Inclusion must mean real choice. No one should be forced into participation through convenience or excluded through design. Everyone must retain the right to remain untracked, unprocessed and private by default; true inclusion includes the right not to be included.

8) “Upstream Consultation: Consultation must shift away from reaction to design. Communities must be involved before systems are deployed, not after harm occurs. Resilience depends on participation, foresight and consent at the point of creation.

“When the camera first appeared nearly 200 years ago, the painter J.M.W. Turner declared it ‘the end of art.’ But it wasn’t. It was the end of one kind of art: art as record. Freed from documentation, artists were liberated to reimagine the world. We are at a similar threshold. We stand at the edge of a profound transition, not just in what AI can do, but in what it reveals. Resilience will not come from adapting faster to machine systems. It will come from reorienting ourselves in relation to them. Now.

“We need new infrastructures – educational, institutional, cultural – capable of holding this moment with care and foresight. We need systems that will protect human agency, not automate it. We need public conversations grounded in ethics, not just outputs. And we need governance that treats this not as a policy issue, but as the civilisational inflection point it is.”


This essay and 200-plus additional responses are included in the 2026 report “Building a Human Resilience Infrastructure for the AI Age.”