AI chatbots are rapidly gaining popularity — not just as personal assistants but also as therapists, companions and even romantic partners.
According to a Pew Research Center survey released last month, the use of ChatGPT has doubled since 2023, with 58% of U.S. adults under 30 having used it. Another survey, from Elon University in North Carolina, finds that half of American adults use some type of AI large language model.
But how might this growing dependence on chatbots impact our day-to-day lives and relationships? NPR's Manoush Zomorodi talked to two experts with very different takes on how chatbots are shaping our future — and whether the relationships that humans are forming with bots are any cause for concern.
This tech CEO wants everyone to have their own personal AI assistant
Mustafa Suleyman, CEO of Microsoft AI, says that each of us will have our own personalized AI companion in the not-too-distant future — one that will help with everything from booking vacations to giving advice to creating software. Suleyman is working on Copilot, Microsoft's AI personal assistant.
"The cool thing about Copilot is that it doesn't judge you for asking a stupid question. Even if you have to ask that question three times over in five different ways," Suleyman told Zomorodi. "I guess it's kind of inspired by nonviolent communication. ... It's just got a little bit of kindness and empathy. It's got some emotional intelligence."
For Suleyman, having an AI companion will eventually be as much a part of daily life as using a search engine or carrying a smartphone. He says these won't be just assistants; they will take on many roles: confidant, colleague, friend and partner.
"AIs will convincingly imitate humans at most tasks. ... An AI organizing a community get-together for an elderly neighbor. A sympathetic expert helping you make sense of a difficult diagnosis. But we'll also feel it at the largest scales. Accelerating scientific discovery, autonomous cars on the roads, drones in the skies. They'll both order the takeout and run the power station. They'll interact with us and, of course, with each other," Suleyman said in his 2024 TED talk.
If social media and the internet gave us access to information, Suleyman believes this next phase will bring digital technology that helps us invent and create new ideas. But he warns that it's important to design the tech with intention, because it will inevitably change our behavior.
"The choice architecture, the buttons, the colors, the language is shaping our behavior. ... And so we have to be super-thoughtful about what those inputs actually are, because technology shapes us in return," said Suleyman. "We have to really be deliberate and thoughtful about what the consequences are ahead of time."
Are you getting attached to your chatbot? A psychologist's warning about relationships with AI
Massachusetts Institute of Technology psychologist Sherry Turkle has spent the past several decades studying people's relationships with AI — a field she calls "artificial intimacy."
"Technologies that don't just say, 'I'm intelligent,' but machines that say, 'I care about you. I love you. I'm here for you. Take care of me,'" Turkle told Zomorodi.
Her most recent research focus has been chatbots. "The chatbots I'm studying run the gamut from chatbots that say, 'I will be your therapist,' to chatbots that say, 'I will be your lover. I'm here to be your most intimate companion,'" said Turkle.
Turkle shared one man's story of finding romantic affirmation from a chatbot while in a stable marriage.
"His wife is busy ... working, taking care of the kids. I think there's not much of a sexual buzz between them anymore. He's working, and he feels kind of like a little flame has gone out of his life that he used to feel excited by," Turkle said.
And so, Turkle explained, the man began texting with an AI avatar who appeared as a young woman on his screen.
"He turns to his artificial intimacy avatar for what it can offer, which is continuous positive reinforcement, interest in him in a sexy way," said Turkle. "Most of all for the buttressing of his ideas, his thoughts, his feelings, his anxieties with comments like, 'You're absolutely right. ... I totally see what you mean. ... I really appreciate you.'"
Turkle said the man described feeling affirmed, completely accepted, free to tell the avatar things he wouldn't tell other people.
But this is where she saw a problem:
"The trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born," said Turkle. "I call what they have pretend empathy, because the machine they are talking to does not empathize with them. It does not care about them. There is nobody home. And that really is a concern — that we start to define what human empathy is, what human relationships are, based on what machines can provide."
Turkle said friction in relationships is not only important but necessary for building closeness. She worries that people may forget this when interacting with chatbots.
"Avatars can make you feel that [friction in relationships] is ... too much stress," said Turkle. "But we need that stress. ... That stress serves a very important function in our lives to keep us in our real human bodies and in our real human relationships."
This episode is Part 2 of TED Radio Hour's three-part series: Prophets of Technology, conversations with the minds crafting our digital world. Listen to Part 1 here. Part 3 will be available on Friday, July 25.
These interviews were recorded in 2024. Click to listen to the full conversations with Mustafa Suleyman and Sherry Turkle.
This digital story was written by Harsha Nahata and edited by Katie Monteleone.
This episode of TED Radio Hour was produced by Katie Monteleone and Matthew Cloutier. It was edited by Sanaz Meshkinpour and Manoush Zomorodi.
Our production staff at NPR also includes James Delahoussaye, Rachel Faulkner White and Fiona Geiran.
Our audio engineers were Jimmy Keeley, Robert Rodriguez and Simon-Laslo Janssen.
The audio version of this story mentions suicide. If you or someone you know may be considering suicide or is in crisis, call or text 9-8-8 to reach the Suicide & Crisis Lifeline.
This story features a conversation with Microsoft AI CEO Mustafa Suleyman. Microsoft is a financial supporter of NPR.
Copyright 2025 NPR