Esther Dyson
Esther Dyson is a serial investor-advisor-angel for tech startups and the founder of Wellville (a community well-being initiative, 2015-2024). She is now working on a new book, “Term Limits: A Design for Living in the Age of AI.” This essay is her written response, submitted in January 2025, to the question, “Over the next decade, what is likely to be the impact of AI advances on the experience of being human? How might the expanding interactions between humans and AI affect what many people view today as ‘core human traits and behaviors’?” It was published in the 2025 research study “Being Human in 2035.”

“The short answer is: The future depends on us. The slightly longer answer: The future depends on how we use AI and how well we equip the next generation to use it. I’d like to share more specifics here, excerpted from an essay I wrote for The Information:

“‘People worried about AI taking their jobs are competing with a myth. Instead, people should train themselves to be better humans.

  • ‘We should automate routine tasks and use the money and time saved to allow humans to do more meaningful work, especially helping parents raise healthier, more engaged children.
  • ‘We should know enough to manipulate ourselves and to resist manipulation by others.
  • ‘Front-line trainers are crucial to raising healthy, resilient, curious children who will grow into adults capable of loving others and overcoming challenges. There’s no formal curriculum for front-line trainers. Rather, it’s about training kids and the parents who raise them to do two fundamental things.
    • ‘Ensure that they develop the emotional security to think long-term rather than grasp at short-term solutions through drugs, food, social media, gambling or other harmful palliatives. (Perhaps the best working definition of addiction is “doing something now for short-term relief that you know you will regret later.”)
    • ‘Kids need to understand themselves and understand the motivations of the people, institutions and social media they interact with. That’s how to combat fake news or the distrust of real news. It is less about traditional media literacy and more about understanding: “Why am I seeing this news? Are they trying to get me angry or just using me to sell ads?” …

“‘Expecting and new parents are the ideal place to begin such training. They are generally eager for help and guidance, which used to come from their own parents and relatives, from schools and from religious leaders. Now such guidance is scarce.’ (End of excerpt)

“AI can give individuals huge power and capacity that they can choose to use to empower others or to manipulate others. If we do it right, we will train children, all people, to be self-aware and to understand their own human motivations – most deeply, the need to be needed by other humans.

“They also need to understand the motivations of the people and the systems they interact with, many of which will be empowered and driven by AI that reflects the goals of the people and institutions and systems that control them. It’s as simple as that and as hard to accomplish as anything I can imagine.”


This and nearly 200 additional essay responses are included in the 2025 report “Being Human in 2035.”