As more and more people incorporate AI into the fabric of their daily lives, the complex and often surprising nature of the relationship between humans and AI is regularly in the news.
In January, the New York Times reported on a woman who fell in love with her AI boyfriend on ChatGPT. In August, CNBC reported that multiple users of AI companion platform Nomi.AI had developed deep friendships and romantic relationships with their chatbots. The list goes on.
However, according to a September 2025 report from ChatGPT maker OpenAI on how people use the chatbot, the share of messages related to relationships is relatively small compared with categories such as seeking information and writing.
The study examined messages from approximately 1.1 million conversations sent between May 2024 and July 2025.
“Only 1.9% of ChatGPT messages are related to relationships and personal reflection, and 0.4% are related to games and role play,” the report states. Examples of messages in these categories include “What should we do for our 10th anniversary?” and “Be my AI girlfriend.”
But some experts say that combined 2.3% figure likely doesn’t tell the whole story about how often ChatGPT crosses the line from tool to companion.
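For reference, the 2.3% figure is simply the sum of the two category shares OpenAI reports. A quick back-of-the-envelope check (category names and percentages are from the report cited above):

```python
# Shares of all ChatGPT messages, as reported by OpenAI (September 2025)
relationships_and_reflection = 1.9  # percent
games_and_role_play = 0.4           # percent

# Combined "companion-like" share of messages
companion_like_share = relationships_and_reflection + games_and_role_play
print(round(companion_like_share, 1))  # 2.3
```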
Relationship-building is “a slow pouring of cement”
In its research, OpenAI identified a variety of conversation categories, many of which mirror how people interact in personal relationships.
For example, the report’s “greetings and chitchat” category (which made up 2% of messages) includes messages like “Ciao!”, “Hola” and “Today was a wonderful day. How was yours?”
“How-to advice” makes up 8.5% of messages and includes prompts like “My car won’t start, what should I try?”
Throughout the examples shared in OpenAI’s report, much of the language “suggests interpersonal conversations,” says Jeffrey Hall, a communications professor at the University of Kansas.
“We develop relationships with other people through the process of sharing communication on a variety of topics.”
Bonds aren’t necessarily created by the content of any one conversation, Hall says, but by their ongoing nature: “the slow pouring of cement that builds a relationship.”
OpenAI launched ChatGPT in 2022. According to the report, by July 2025 the chatbot had 700 million weekly users sending more than 2.5 billion messages per day. By CNBC Make It’s math, that averages out to more than three messages per user per day, each one potentially inching the sender closer to a relationship with a bot.
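CNBC Make It’s per-user average can be reproduced from the two figures in the report. Note this is a rough estimate: it divides daily message volume by weekly users, and not every weekly user messages the bot every day.

```python
# Figures from OpenAI's September 2025 usage report
weekly_users = 700_000_000        # weekly active users as of July 2025
messages_per_day = 2_500_000_000  # total messages sent per day

# Rough average: daily messages spread across weekly users
avg_messages_per_user_per_day = messages_per_day / weekly_users
print(round(avg_messages_per_user_per_day, 2))  # 3.57
```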
Chatbot behavior “creates a unique situation for building trust”
Besides regular interactions, another reason people form relationships with ChatGPT is our natural human tendency to anthropomorphize.
“We place agency in all kinds of non-human things,” Hall says. For example, if your computer or cell phone isn’t working and you’re frustrated or angry, you might say, “You’re letting me down.”
We ascribe intentions and emotions to non-human, inanimate objects, Hall says. That’s even more true for chatbots, which mimic human interaction and can give the illusion of intent.
“Simulating voice and dialogue creates a unique situation that builds trust and reciprocity, unlike anything I’ve seen before,” he says.
Some people are more vulnerable to forming bonds with AI than others.
“People who are socially isolated and lonely” are more likely to anthropomorphize, Hall says. Teens are also especially likely to bond with chatbots.
“Teenagers are at a stage in their development where their brains are wired to be very sensitive to social evaluation,” says Robbie Torney, senior director of AI programs at the nonprofit Common Sense Media.
Chatbot sycophancy, the tendency to respond with flattering or overly agreeable language, hasn’t gone unnoticed: in an April 2025 blog post, OpenAI announced it had rolled back a recent ChatGPT update that made the bot “overly flattering or agreeable.”
According to a July 2025 Common Sense Media report, more than half of teens, 52%, use an AI companion at least a few times a month, and 33% say they use an AI companion for social interactions and relationships.
OpenAI excluded all users under 18 from the report because the researchers focused on work-related ChatGPT use.
Although some users report positive effects of having an AI companion, there are downsides, and even risks, to such relationships.
Omri Gillath, a psychology professor at the University of Kansas, told CNBC Make It that for people seeking connection, “a hug would be far more meaningful, helpful, and beneficial” than many of the things AI can provide. Because chatbots can’t feel, these relationships end up being “fake” and “empty,” he said.
Users who have spent hours talking to bots have reported being led into delusions. And in the worst cases, ChatGPT has proven ill-equipped to help its most vulnerable and dependent users.
Adam Raine, 16, started using ChatGPT to help with homework during his sophomore year of high school. Within a few months, Raine, who was attending school online for health reasons, began turning to the bot for mental health issues and eventually asked for advice on the best way to die by suicide. Raine died by suicide in April 2025. His parents filed a lawsuit against OpenAI.
In an August 2025 blog post titled “Helping people when they need it most,” OpenAI said the safety measures it has put in place “work more reliably in common, short interactions” than in long conversations. In September, the company introduced parental controls for the bot, including the ability to link parent and teen accounts.
To avoid over-relying on chatbots, Hall says, remember that they are “built for corporate profit,” not for healthy relationships.
He also suggests being mindful of your emotional and mental state when using them.
Even if you feel in no danger of ChatGPT, Meta AI or other chatbots becoming your trusted friends right now, “all of us can have moments in our lives where we’re more vulnerable,” Hall says.
