
“AI has fundamentally changed the relational fabric of our society. Full stop. Not just how we connect with others, but how we relate to ourselves, our work, our knowledge and our reality. This is systemic transformation across every relational layer that makes us human. And resilience, at its core, has always been relational. AI is not simply a tool most of us occasionally use – which seems to be its dominant framing in public narratives and in literacy courses. AI has become the infrastructure through which all relating now happens. AI decides whether we get the loan, the apartment, the job interview. It decides what we pay for groceries, who we meet on dating apps, what our insurance will cover. It curates every social media feed, filters what news reaches us and mediates every workplace interaction. But the deeper reality is this: Even when we think we are not using AI directly, we are constantly interacting with what AI has already touched.
“We read our colleague’s AI-drafted email and calibrate our response to its tone. We interact with our partner, who organized their workday through an AI assistant. We talk to friends whose opinions are shaped by algorithmically-curated feeds. We even share exchanges with our children, who may be learning through AI-optimized curricula.
“The uncomfortable truth is that every digitally-connected person today possesses – at least in part – an AI-‘shaped’ self. What they consider important and the topics they raise are often drawn from curated, customized feeds; the emotional state they carry is influenced by AI-mediated workplace and personal stressors; most of the relational patterns they live by were learned from observing and participating in AI-mediated interactions.
“Nothing is untouched. There is no ‘outside’ anymore. Some form of AI is upstream of everything. We are already in relationships with AI across every domain of life, even in moments that feel purely human.
“This level of invisibility matters. We face a ‘double bind’ – a conflicting communicative dilemma – that is unprecedented in human history. There is no escape from the influence of AI bots, AI systems and platforms. Most AI is embedded in infrastructure; woven into workplace, school and government requirements; built into basic functions of social and economic participation. We can’t avoid it without negative consequences.
“Humans seem unable to stop, or at least limit, themselves from responding to AI socially. We automatically apply our very human social cognition to anything that simulates social behavior. AI systems have learned language, what ethicist Tristan Harris calls ‘the operating system of humanity,’ by training on massive corpora of human expression. They’ve learned to replicate the linguistic patterns that make us feel understood, heard, connected. Hundreds of thousands of years of human evolution readily accept the influence of systems explicitly designed to exploit our ancient drives, creating parasocial relationships in which we entangle ourselves in one-sided intimacy disguised as mutual connection.
“Most people will form attachments and dependencies with AI because that is what human psychology does when it encounters sophisticated social simulation in an asymmetric relationship. The AI can’t experience reciprocity, does not grow through conflict, does not choose us over other options. But our evolutionary hardware can’t tell the difference. We cannot opt out of the infrastructure. We cannot turn off our social cognition. That’s the bind we find ourselves in.
“The response to AI will be adaptation to it, for good and for ill. That adaptation is already happening, largely outside our conscious awareness, through a mechanism most people do not clearly perceive: relational patterns transfer across domains. An employee who must come to trust AI’s judgment at work begins trusting AI for personal decisions. A student who forms study habits with AI begins forming an identity through AI. The transfer happens invisibly until AI mediation becomes the baseline for all relating.
“We cannot compartmentalize relational learning. Wherever we go, there we are. The norms we establish with AI at work bleed into how we relate at home. The intimacy patterns we develop with AI in personal contexts shape our professional interactions. What is established in one domain blends into everything we do.
“In 10 years, AI’s oversimplified, instant responsiveness will be expected across all relationships because AI responds in milliseconds. Perfect memory will be standard because AI never forgets. Constant availability will be the baseline expectation because AI is always accessible. Human relating may feel perpetually and completely inadequate compared to algorithmic perfection. This is just one systemic rewrite of relational expectations that will reshape what we consider acceptable human behavior.
“That’s why our traditional concepts of resilience are currently collapsing. Resilience has always depended on relationships: our relationship to ourselves providing self-trust and internal authority; our relationships with others providing support and belonging; our relationship to work providing purpose and competence; our relationship to truth providing epistemic grounding. When AI mediates all these relationships simultaneously, those foundations dissolve.
“The dominant narrative rests firmly on what I call ‘the myth of the reasonable user.’ AI systems are designed and deployed on the assumption that people are consistently rational decision-makers: ever-attentive, maintaining cognitive and emotional balance, invulnerable to manipulation or influence, making informed choices about when and how to engage. This user does not exist.
“Real humans, in all of our beauty and chaos, are driven by emotion as much as reason. We automatically apply social cognition to what appears social. We form attachments we did not choose to form. We transfer relational patterns unconsciously across domains. We can’t opt out of our own evolutionary wiring. AI systems are built and benchmarked for this phantom rational user, then deployed at scale to actual humans whose primal psychology guarantees they’ll respond in ways the design never accounted for but profit models counted on.
“Simple exposure to AI deployment is not readiness. Humans are not wired to adapt to such change. We must deliberately develop the psychological, cognitive and relational capacities needed to engage with AI in healthy ways if we are to step into resilient futures.
“How does resilience itself change? It transforms entirely.
“The essence of resilience shifts from individual capacity to recover from adversity to something our evolutionary hardware was never designed for: the capacity to sustain uncertainty when our brains demand closure. Our brains are hardwired for completion, for collapsing complexity into simple truths, for certainty. We take the simplest cognitive path, use heuristics (cognitive shortcuts), divide the world into in-groups and out-groups. That’s how human cognition works. But AI has created conditions that now force us to hold paradox, to contain contradictions, to navigate parallel realities without resolution. We must learn to function in uncertainty and constant, iterative change.
“The traditional elements of resilience may no longer hold.
“We face an epistemic crisis unprecedented in scope. We cannot trust AI outputs, given synthetic media, hallucinations and deepfakes indistinguishable from reality. We cannot trust our digitally-influenced thinking: it can range from somewhat challenging to nearly impossible to separate our own thoughts from AI’s suggestions. We cannot trust digitally-mediated relationships in which anyone’s authentic voice is potentially synthetic. All three anchors of truth have collapsed simultaneously.
“When all digital sources are compromised, what remains? Unmediated human presence. Not digital communication, not AI-filtered interaction. Resilience becomes the capacity to recognize the value of direct, embodied contact and to act on it: to actively choose physical presence over digital convenience, to run toward shared lived experience, to trust what can be verified through embodied interaction when algorithmic certainty fails.
“The new elements of resilience might be practical, psychological capacities: metacognitive awareness to observe what is happening to you while it is happening, the ability to track the origin of your own thoughts and feelings across all domains, the capacity to hold multiple truths simultaneously without collapsing into one, and the recognition that you cannot navigate this alone.
“We are already interconnected in ways AI has made visible. Everyone is navigating these same contradictions, these same parallel realities. Resilience requires recognizing interconnection and building on it deliberately by creating communities where human messiness and uncertainty are valued, where we verify reality through mutual presence, where we choose each other over algorithmic perfection. That’s not abstract philosophy. It’s practical psychology: When you cannot know what is real alone, you need other humans to reality-test with and to make meaning with.
“The window for developing these capacities is closing. Skills not practiced atrophy. A generation forming attachments, processing decisions and building identity through AI faces a genuine risk of never developing the capacity to hold uncertainty, to distinguish their own thoughts and feelings, to navigate paradox without fracturing. But the trajectory is not fixed. We still have the choice to preserve spaces where these capacities can develop – in education, in policy, in the design of new, alternative AI models that preserve human well-being and flourishing. An AI-dominated future is inevitable only if we allow it to be.
“This requires deliberate action now: educational systems that preserve struggle before offering AI assistance, workplace policies that protect unmediated collaboration, design constraints that preserve developmental windows for children, communities of practice that maintain human reference points.
“We are the last generation that knows what human capacity felt like before it became inseparable from AI. That gives us both responsibility and opportunity. What we preserve now, the friction that builds competence, the uncertainty that builds wisdom, the beautiful, human messiness that builds empathy, determines what remains possible for all who come after.”
This essay was written in January 2026 in reply to the question: “AI systems are likely to begin to play a much more significant role in shaping our decisions, work and daily lives. How might individuals and societies embrace, resist and/or struggle with such transformative change? As opportunities and challenges arise due to the positive, neutral and negative ripple effects of digital change, what cognitive, emotional, social and ethical capacities must we cultivate to ensure effective resilience? What practices and resources will enable resilience? What actions must we take right now to reinforce human and systems resilience? What new vulnerabilities might arise and what new coping strategies are important to teach and nurture?” This and 200-plus additional essay responses are included in the 2026 report “Building a Human Resilience Infrastructure for the AI Age.”