
“I think of the coming 10 years as the ‘Cyborg Slide,’ a time when we will develop new abilities but at the cost of shedding parts of our humanity that we must work to hold onto. The quickest way to describe the biggest problem during this slide is that we will be increasingly invited to surpass the ‘slow and small’ conditions for human meaning as we have known it. Whether we want to retain access to friendship or forgiveness, justice or even jokes, we will need to resist the urge to always go bigger, move faster, live longer and prioritize quantity of conversation partners over meaningful relation. For centuries, our ‘stories of self,’ and the meanings that such stories make possible, have been conditioned by a certain rich slowness and good smallness, even with the vast diversity of individual stories and even with all of the increases in speed, from horse to car to airplane.
“Our human concept of friendship, for example, has quietly relied on certain ‘slow and small’ limits on the number of people we might expect to know and the number of years we might expect to live. The delicate act of, say, an inside joke with a good friend is not just lost but impossible in a social media chat with millions of strangers, because that is precisely not what ‘inside joke’ means.
“Same for forgiveness. If one doesn’t know anyone slowly enough to wound or be wounded, one loses access to the category of forgiveness: ‘forgiving’ and ‘not forgiving’ increasingly fail to hold meaning.
“The growing drift from human to cyborg signals a rewriting, not simply of our smartwatch styles but of our ‘story of self’ and the meanings that are allowed to circulate within the context of that story. If we allow ourselves to become cyborgs, can we tell inside jokes to close friends? Not in any current sense of the terms ‘inside joke’ or ‘close friend,’ because as cyborgs we will have surpassed so many of the current ‘slow and small’ conditions of how we relate to limited time itself, and with it how we experience self, neighbors, pasts, futures, memories and hopes, all in relation to what ‘friendship,’ ‘jokes’ and ‘inside jokes’ mean.
“To help embrace many AI advances while avoiding the Cyborg Slide and its resulting loss of access to cherished human experiences, here are some of the interrelated goals and strategies we must take up now:
1) “Help people talk more about the distinction between embracing many aspects of AI and ensuring that AI does not inadvertently prevent people from accessing their favorite human experiences.
2) “Develop ways of talking about AI futures that neither demonize nor utopianize but rather cultivate in people a ‘pros and cons’ mindset when it comes to any AI enhancement: How will it make my life better? How will it quietly rob me of access to my most cherished human experiences?
3) “From the number of people we set out to know and the time-consuming process of building strong relationships, to the slow simmer of friendship and the intimate scale of forgiveness, help people understand how ‘slow and small’ parameters of human life enable some of our most cherished experiences.
4) “And – using the ‘pros and cons’ mindset – help people consider how even small AI disruptions of those parameters might risk robbing them of their most cherished experiences; whether (and if so, why) they might be willing to take some of those risks but not others; and whether there are or aren’t ways to take up particular pieces of AI technology so as to minimize their likelihood of robbing us of access to our favorite human experiences.
“All of this should be undertaken through writing, art, media and film, exposing people to more of these conversation frames, pro and con ideations, and a growing number of concrete case studies in, and explorations of, the conditions for and textures of human experience that are most worth saving and most susceptible to interference in increasingly AI-saturated futures.”
This essay was written in January 2026 in reply to the question: “AI systems are likely to begin to play a much more significant role in shaping our decisions, work and daily lives. How might individuals and societies embrace, resist and/or struggle with such transformative change? As opportunities and challenges arise due to the positive, neutral and negative ripple effects of digital change, what cognitive, emotional, social and ethical capacities must we cultivate to ensure effective resilience? What practices and resources will enable resilience? What actions must we take right now to reinforce human and systems resilience? What new vulnerabilities might arise and what new coping strategies are important to teach and nurture?” This and 200-plus additional essay responses are included in the 2026 report “Building a Human Resilience Infrastructure for the AI Age.”