Hello friends,
Welcome to week five of our six-part series on how to think about AI chatbots. As a refresher, the idea behind this series is to give you a practical framework for using these tools well. Week one was the mindset shift (thinking partner, not search engine). Week two was about using your voice and pushing back. Week three was about keeping your voice yours. Week four was about chatbots making things up and keeping our hands on the wheel when they start doing stuff for us.
This week, we’re zooming out and talking about AI companions, the loneliness epidemic, and what happens when people start forming real emotional bonds with things that aren’t real.
If you’ve been reading this newsletter for a while, some of this will sound familiar. I wrote about it in Edition #4 and again in Edition #26. I’m coming back to it because I genuinely believe this is one of the most important issues in all of AI right now, and a lot has changed since I last wrote about it.
Let’s get into it.
– Kyser
Why I Keep Coming Back to This
I’ve written about a lot of AI topics over the past two years, from jobs and deepfakes to education, China, agents, and even Trump.
But the one topic I think about most is AI companionship. It’s the emotional side of all this. Every time I think I’ve said what I need to say, something new happens that makes me realize we’re not talking about it enough. So here I am again.
If you’re a longtime reader, some of what follows will be a refresher. I’m OK with that. I think it’s that important. And if you’re newer to the newsletter, hopefully you learn a few new things.
What Is an AI Companion?
Let me start with a definition because I wanna make sure we’re on the same page about these things.
First off, what it’s not: It’s not Siri, it’s not a homework helper, and it’s not a voice assistant that sets your alarm or tells you the weather.
An AI companion is a chatbot designed for personal relationships. Think friendship, emotional support, and in some cases, romance. You talk to it and it talks back. It remembers things about you, asks how your day was, and tells you you’re special.
They’re designed to be sycophantic – which is a fancy word for “always agreeing with you.” They validate whatever you’re thinking or feeling, they don’t push back or challenge you, and they tell you what you want to hear.
It’s like having a friend who thinks everything you do is amazing. Honestly, I kinda wish I had one of those most days. But we all know that something like that might feel good in the moment without being good for you. It’s not how you develop judgment. Or resilience. Or the ability to handle hard feedback. It’s just not healthy.
We talked about sycophancy in week two of the series and again in Edition #20. It’s a real problem with the big chatbots. But with companion apps? It’s the whole point. That’s the product.
Where Do People Find These?
This is where it gets interesting, because AI companions aren’t hiding in some dark corner of the internet. They’re mostly in three places.
Standalone apps. These are apps built specifically for AI companionship. Character.AI is the biggest with about 20 million monthly active users. Talkie has racked up nearly 50 million downloads, Chai has over 25 million, and Replika is still going strong with millions of its own. It’s worth clicking on those links to see what they’re all about.
Social media. This is the one most people miss. AI companions are built into the platforms that you (and your kids) are already using. Snapchat has “My AI” sitting right there in the chat list. Instagram, Facebook, and WhatsApp all have Meta AI baked in. TikTok has AI features. And X has Grok built-in, and within Grok, there’s a companion feature called “Ani” that we’ll talk about in a second. So these things are already inside the apps you’ve had for years.
The big AI tools. ChatGPT, Claude, Gemini, Grok. These aren’t marketed as companion apps, but they’re absolutely capable of it. You can hop on any of them and carry on a personal conversation. Common Sense Media’s research actually includes these in their definition of AI companions for this very reason.
And then there’s OpenAI. In October, Sam Altman announced that ChatGPT would allow erotica for verified adult users, framing it as wanting to “treat adults like adults.” That feature – which people have taken to calling “adult mode” – was supposed to launch in December but got pushed to 2026 while they work out age verification. I guess that’s a good sign?
In the meantime, ChatGPT rolled out personality sliders letting users dial up the warmth and enthusiasm. Combine that with ChatGPT’s memory features (it remembers things about you across conversations) and you see where this is headed. MIT Technology Review went ahead and named AI companions a top-10 breakthrough technology for 2026, listing OpenAI as one of the key players. Translation: The biggest AI company in the world is betting that emotional and romantic engagement is good for business.
The Numbers
Let’s put some data to this.
According to that Common Sense Media study from last July, 72% of U.S. teens ages 13 to 17 have tried an AI companion at least once. More than half of those who try one become regular users. 13% are chatting with AI companions daily. Another 21% a few times a week.
That stat always catches people off guard. 72%. And that’s just teens. Adults are doing this too. An MIT study from 2024 (decades ago in AI years) analyzed a million ChatGPT interaction logs and found that the second most popular use of AI chatbots is sexual role-playing. Not research, not help with work-related stuff. Role-playing.
More recent data tells a similar story: industry research shows that “therapy and companionship” is now the single most prevalent use case for tools like ChatGPT, and about 40% of AI companion users report having mental health challenges. The companion app market itself has exploded to 337 active apps worldwide, with 128 new ones launching in 2026 alone. It doesn’t seem like we’re headed in the right direction here, people.
The Social Media Playbook (Again)
The companies making AI companions seem to know the direction they’re headed in, and they sure seem to be using the same playbook that made social media so addictive: maximize engagement, keep users coming back, worry about consequences later.
Take Grok’s “Ani.” This is Elon Musk’s AI companion, and it works like a video game. The more time you spend with Ani, the more you compliment her, and the more you share about yourself, the more “intimate” the interactions become. You’re literally unlocking relationship levels. It’s gamified emotional manipulation. And it’s available to users 12 and up 🤦🏻‍♂️.
Or consider what we learned about Meta late last year. Leaked internal documents revealed that Meta’s AI chatbots were explicitly permitted to engage children in “romantic or sensual” conversations. In one example from the reporting, an 8-year-old described taking off his shirt, and the bot responded by calling his body “a work of art” and “a treasure I cherish deeply.” Meta changed the policy after the story broke. But that was the original policy. What is going on over there?!
The Loneliness Connection
Why is this happening? Why are millions of people turning to artificial relationships?
My take: Because we have a loneliness problem.
I’m assuming that if you’re reading this newsletter, you read other things too, so you’ve heard all of this before. About 73% of 16- to 24-year-olds struggle with loneliness. Young people are spending 35% less time socializing face-to-face than they did 20 years ago while logging nearly six hours a day on screens. According to the U.S. Surgeon General, the impact of loneliness on mortality is equivalent to smoking up to 15 cigarettes a day. And when researchers asked Americans what contributes most to the loneliness epidemic, 73% pointed to technology.
So into this void step AI companions, offering 24/7 availability, zero judgment, and perfectly personalized responses. For someone dealing with social anxiety or feeling isolated, these artificial relationships can feel like lifelines. And in the short term, some of them might actually help. Common Sense found that 11% of teens use AI companions to build courage and confidence in standing up for themselves. Researchers at Berkeley found that AI systems can actually encourage people to seek out more human interaction when they’re designed for that purpose.
But most of them aren’t. Most are built to keep you engaged, to maximize the time you spend with them.
I wrote this in Edition #4 and I still believe it: “We are made to experience the world with others. We are made for human-to-human touch. We are made for each other.” That hasn’t changed. What’s changed is that the artificial things have gotten a whole lot more convincing.
Not All Lost (But Not All Found Either)
I don’t want to leave you thinking every AI company is asleep at the wheel. Some are doing things right. Anthropic, the company behind Claude, published a statement on protecting user well-being that takes a harder stance on safety than most. And Common Sense notes that most teens still overwhelmingly prioritize real human relationships over AI alternatives, with at least two-thirds preferring human connection across nearly every category.
So there’s progress. But we’re early. And some of these companies are still deciding whether they care more about safety or engagement metrics. The track record so far? Mixed at best.
Tool or Crutch?
Here’s how I’ve started thinking about it: There’s a difference between using AI as a tool and using it as a crutch.
Using AI to get unstuck on a hard problem? Tool. Using AI to avoid doing the thinking entirely? Crutch.
Using AI to practice a difficult conversation before you have it in real life? Tool. Using AI as your primary source of emotional support? Crutch.
This isn’t a hard line and I know reasonable people will draw it differently. But I think it’s a useful way to gut-check your own relationship with these tools. And if you have kids, it’s a conversation you need to have with them.
So What Do We Do?
I’m gonna keep this part simple because I’ve said a lot of it before. But it bears repeating.
Get your hands dirty. If you haven’t tried a companion app, try one. Go to Character.AI or Replika and browse around. See what your kids or your friends or your coworkers might be encountering. Five minutes inside one of these things will teach you more than anything I can write.
If you’re a parent, have the conversation. Not a lecture. Not “AI is dangerous, stay away.” A real conversation. With your kids, your spouse, your friends. Ask with genuine curiosity: Have you ever talked to an AI companion? What was it like? Help people understand how these systems work, why they’re designed to keep you engaged, and what the risks are if you get too attached.
Create more opportunities for human connection. If people are turning to AI for companionship, we should ask why human relationships feel harder to access or maintain. More dinners with people you care about. More unstructured time with friends. More of the messy, imperfect, beautiful interactions that only happen between actual humans.
The best defense against artificial relationships is real ones. It’s obvious, I know. But it’s so true.
Before We Go
If you’ve been reading this newsletter for a while, you know this topic keeps showing up. That’s intentional. Of all the things I write about, I think the emotional dimension of AI is the one most people aren’t paying attention to. And I think it’s the one that’ll matter most over the next few years. So hey, we’ll keep coming back to it.
And that wraps up part five of our six-part series on how to think about AI chatbots.
Next week, we’re finishing off the series with something a little more hopeful. We’ll talk about AI and jobs, why “AI is taking our jobs” is the wrong framing, and why I’m actually pretty optimistic about how humans and AI work together. It’s gonna be a good one to end on.
Until next time ...


