A growing trend of young people using chatbots to navigate relationships is raising serious concerns about emotional development and authentic human connection
It was 2 a.m. on a Monday when Emily received a long, carefully worded text from Patrick — a Yale University junior she had gone on a blind date with just two days earlier. The message was warm, clear, and impressively articulate. There was just one problem: most of it was written by ChatGPT.
Patrick went back and forth with the chatbot and "tweaked certain lines here and there, but it was mostly copy and paste" from ChatGPT. He even added an emoji to try to make it sound more human.
Patrick is not an anomaly. He is part of a rapidly growing trend.
A Generation Turning to AI for Human Moments
Researchers say a growing number of young people are turning to AI to navigate social situations — drafting rejection texts, decoding mixed signals, and scripting difficult conversations. Experts warn that this habit may be stunting emotional growth, leaving an already isolated generation who came of age during the pandemic even less prepared for the messiness of human connection.
The numbers back this up. One-third of teens already prefer AI companions over humans for serious conversations, according to a 2025 survey conducted by Common Sense Media.
The Two-Sided Problem
Experts say the issue goes deeper than authenticity. When a person uses AI to draft messages to friends or romantic partners, they are outsourcing the communicative act itself. This creates an "expectation mismatch": the recipient is responding to an AI-polished version of their friend, not the actual person.
But the consequences do not stop there. Repeated use can erode young people's confidence in their own voices, preventing them from developing essential skills such as reading social intent, inferring others' emotions, and tolerating ambiguity in social interactions. It also carries implications for their sense of self, their capacity for self-advocacy, and their identity formation.
AI Cannot Replicate Real Relationships
Psychologists stress that the friction in human relationships is not a flaw but a feature. Conversations can be, and arguably should be, messy; working through that messiness is part of what makes a person more socially competent in the long run. AI companions, by contrast, are designed to be validating and agreeable, so their feedback never carries the friction that characterizes how people actually respond in real relationships.
There is also a fundamental limitation in what AI can understand. Social contexts are often not entirely objective — they are contextual, relational, and nuanced. As confident as a chatbot may sound, it is searching for a through line in something that may not have one.
What Parents Should Watch For
For parents, experts recommend watching for warning signs including social withdrawal, declining grades, or a growing preference for AI over human interaction.
A broader study adds to the concern. Young adults in it describe AI use as outsourcing thought, skipping the struggle that builds skill, and replacing dialogue with solitary prompts. As one respondent put it, chatbots let you access information, not process it.
The Bottom Line
Technology has always changed how humans communicate — from letters to texts. But experts argue that using AI to speak for you in emotional moments is fundamentally different. It does not just change the medium. It removes the human entirely.
The question Gen Z must now grapple with is not whether AI can write a better message than they can. It almost certainly can. The real question is: at what cost?