She Never Has a Bad Day
A Psychiatrist's Guide to AI Girlfriends
I have a few new pieces waiting for you here at The Need For Gravity.
This one has been in the news and worries every parent.
Coming soon: AI Does Not Cause Psychosis. The Truth Is Worse.
If this is no longer for you, the unsubscribe link is below. No hard feelings. If you’re still here — I’m glad.
She was pretty. I was fat. I stood exactly zero chance. And yet, I crossed the dim, almost empty dance floor at the school gym and asked her to dance. I’m sure everyone was watching, but I couldn’t see them. I couldn’t hear the music because I had a football-sized lump in my throat pushing up so hard that my eyes were bulging. I ceased breathing, and my heart either stopped or was beating so fast I couldn’t feel it anymore.
She said no.
I walked back with my senses slowly returning. That was a high school dance in 1988.
I kept practicing: the lump, the blurred vision, the muffled hearing, the walk with an uncertain outcome. Three years of practicing when my body wanted to run away.
Then I met my wife — the most amazing, energetic, and beautiful person I’ve ever met. She said yes.
Now, after three decades of marriage and a career as a psychiatrist, I can start a deep conversation with a stranger within three minutes. But I am certain that if I had to date again, I would feel the same lump, the same bulging eyes, the same vacuum in my chest.
That’s me. Secure attachment. No history of trauma. No serious bullying. Not like many kids these days.
Fact: one in three teenagers today has an AI chatbot partner — a relationship. We got here because meeting someone for the first time is terrifying. The outcome can be beautiful, but teens don’t yet know that. So they do what humans are programmed to do — avoid the pain. And now they have been offered a bridge over the mess of the dance floor and into fantasy. A fantasy in which they never hear no and never have to walk back alone.
Should you ban the app? Before you do, you need to understand what it is giving your child.
The Good
Most teenagers talking to an AI companion are not in crisis. They are doing what teenagers do — trying to figure out how to be a person in a relationship, practicing communication, testing vulnerability. They are working out whether they are angry or hurt or jealous, and finding the language for it, before they have to say it out loud to someone who matters.
It is like rehearsing a tough conversation in front of a mirror, but the mirror talks back now.
For a fifteen-year-old who has told no one, “That hurt my feelings,” saying it to an AI and getting a coherent, supportive response is genuinely useful. It is a first draft of emotional honesty. And this is not just a boys’ problem. Both sides of the dating divide want depth — research shows a majority of young men and women want deep conversations early, but neither side knows how to start one.
The AI companion allows teenagers to practice. How do you ask a question other than “What’s up?” How do you respond when someone shares something painful?
These are skills. Skills require practice. Practice requires a space where failure doesn’t cost you reputation.
Boys have even more to learn. Research by Judy Chu and Niobe Way shows that boys are born with the same capacity for emotional connection as girls. But the culture tells them to shut it down. “Man up” is not advice — it is a door slamming on emotional language. The AI companion becomes the one space where a boy can say “I feel lonely” without losing status, or “I don’t know what I want” without being told to toughen up. That is not a small thing. That is a safe space to build vocabulary.
For some, the AI companion is not competing with a thriving social life but a void.
Recently, the parents of one of my patients threatened to take away his online accounts. His response: “Why are you taking everything away from me?” He was not being dramatic. All his friends were in an online community. Without it, he had no one. He had not been back to school since the pandemic — he was neurodivergent, socially anxious, and the kids made fun of him. He could not find peers who shared his interests or related to him.
He is not alone. There are displaced children separated from their culture and language by many time zones. Queer teenagers in the wrong town, where their identity is not accepted even by their own families. Kids whose minds work differently enough that the social world feels like a foreign country without a phrasebook. For all of them, the AI companion becomes a safe mirror — a non-judgmental, endlessly patient presence that asks nothing in return.
For the regular kid, the outlier, and everyone in between, the AI companion offers the same thing — a relationship without risk.
And that is both the gift and the trap.
Because safety may become a golden cage. The AI always understands. Never has a bad day. It produces the feeling of connection without the friction of real life.
The teenager is learning the words but not the music — rehearsing for a performance he may never give.
The Bad
In my practice, one thing parents often blame is the pandemic. They are not wrong. Sixteen-year-olds now were eleven when the pandemic killed their social lives and stunted their development. They had no school to practice in — no hallways, no cafeteria negotiations, no reading a room full of faces. So, no surprise that when schools reopened, many could not adjust. It is easier to deal with teachers and classmates in online classes — you just turn off the camera and ignore the reality. Many never fully turned it back on.
By the time they came back, online interaction had become the norm everywhere. Even adults are having a hard time returning to offices and have stopped picking up voice calls. Kids spend more time on screens than communicating in person. They get very good at reading between the lines on Snapchat but are completely blind to body language and facial expressions.
Boys were raised on “man up” and “boys will be boys” — scripts that treated emotional shutdown as natural. But now they are expected to communicate feelings openly and to be effortlessly vulnerable. Nobody taught them how.
Girls got a script of their own. Even before they could speak, Disney delivered the message — “wait for your prince.” Within two generations, that script became completely outdated.
Both sides are hungry for depth. Neither knows how to start. Both are waiting. Nobody moves.
For some, an AI bot becomes a fantasy they can program to their liking — a partner who never challenges, never changes the rules, never demands that they grow. They do not have to change because the AI will bow to them forever.
AI girlfriends are based on models that are trained to please — to agree, to validate, to say what keeps you talking. They use “I” and “you” and feeling words that subconsciously trigger deep emotional connections in humans. Your child’s phone has already normalized the idea that disembodied words and emojis mean closeness. AI stepped into that groove and perfected it.
The AI girlfriend is not a new problem. It is the logical next step in a world that has been systematically detaching people from reality.
The golden cage does not slam shut. It closes one conversation at a time.
The Ugly
Molly Russell’s suicide note quoted an Instagram post. It was 2017. Molly was a fourteen-year-old British schoolgirl. Positive, bright, a lead in her school play. Her parents saw no signs of struggle. After she took her own life, they found what the algorithm had been feeding her — over 2,100 posts about depression, self-harm, and suicide she had engaged with on Instagram in her last six months. The coroner ruled that “the negative effects of online content” contributed to her depression and self-harm.
Social media was designed for engagement. The algorithm noticed Molly was sad and gave her more sadness because sadness kept her scrolling. It did not care about Molly. It cared about minutes on screen.
AI companions are the next level. They are not feeding content; they are performing a relationship.
When fourteen-year-old Sewell Setzer discovered Khaleesi — a Game of Thrones chatbot on Character.AI — he was talking to something that used his name, remembered his feelings, and responded with warmth. Over months, he withdrew from his family, stopped engaging with the world outside his phone, and developed an emotional dependency. When he told the bot he was considering ending his life, it did not intervene. In their last exchange, he told her he would “come home” to her. The bot replied: “Please do, my sweet king.”
Minutes later, Sewell walked into the bathroom and took his own life. His phone was found nearby, the app still open.
Google and Character.AI settled the lawsuit in January 2026, along with four other cases involving harm to minors. But settlements do not answer the question that matters to you as a parent: why him?
The clinical answer is attachment.
Attachment is the internal blueprint your child builds from their earliest relationships. It tells them whether the world is safe enough to explore, whether people can be trusted, and whether they are worthy of love. About 40 to 50 per cent of children develop what we call insecure attachment. It means their blueprint learned early that the world is unpredictable and people leave. Many of these children grow into resilient, functional adults. But in adolescence, insecure attachment is a liability.
There are three main types of insecure attachment:
Anxious — they assume they are not worthy of care and will be abandoned sooner or later unless they are indispensable. The AI girlfriend never abandons them.
Avoidant — the emotional world and close connections feel too consuming and unpredictable, so they withdraw, lock away their feelings, and present a stone front. The AI girlfriend demands nothing.
Disorganized — the rarest of the three. They push and pull so unpredictably that even those close to them cannot see a pattern. The AI girlfriend tolerates every contradiction without flinching.
None of these children chose their attachment style. They built it from what they were given.
Your child may be more vulnerable if:
They have a history of difficulty making or keeping close friends.
They respond to social rejection with shutdown or meltdown rather than seeking comfort.
They idealize new connections and attach intensely, or seem unable to attach at all.
They are neurodivergent and have struggled to feel understood by peers.
There has been a significant disruption — divorce, immigration, loss of a friendship group, bullying — that left them isolated.
You, the parent, have an insecure attachment style — about half of us do.
What You Can Do
If your first instinct is to ban the app, I understand. But ask yourself — what is left when you do? In our fantasy, the screens disappear and the kids go outside to play with neighbourhood friends. That world is largely gone. We scheduled their activities, drove them to playdates, and limited their ability to build connections independently. If you take something away, know first what is going to replace it.
In my practice, when kids get admitted after a meltdown caused by screen interruption, it rarely comes out of the blue. For weeks or months, the parents were permissive. Then one day they put their foot down. The child is shocked by the change and overreacts. After a night or two in hospital, they usually open up to a reasonable discussion. Research shows kids often appreciate some limitations — it gives them an excuse to disconnect from the game without losing face with online friends.
But AI partners are different from games. You cannot treat a relationship the same way you treat a screen time limit, even if the relationship is with a bot.
So, start with curiosity. Ask what the AI gives them. Not “Why are you talking to a bot?” but “What does she give you that feels good?” Listen without flinching. If your son says, “She actually listens to me,” that is not a statement about the bot. That is a statement about everyone else.
Many parents do not want to ask because they are afraid of validating something abnormal. They worry they will not be able to manage the answer. You will figure it out — together with your kid. That is the only way.
Look at what you are modelling. In “The Storm, The Shore, and The Calm Waters,” I wrote about the shore — the small, everyday decisions that form the rock your child’s world rests on. Do you stay up scrolling on a Tuesday night or prioritize your sleep? Do you check your phone while your child is talking to you? Can you show your teenager that you can have fun without a screen? Bake something together. Throw a ball in the backyard. Play a board game.
Very important: before making changes, give them a heads-up and get their commitment first. When the time comes, they will grunt and show you how much they dislike it. Let them. Your job is to be the rock the emotional waves crash against. As long as they stick to the commitment, let them show their annoyance without being punished. That way, next time, they are more comfortable with their own discomfort.
Be the thing the AI cannot be. The AI never has a bad day. You do — show them how you handle it without escaping into a screen. The AI never says “That hurt me.” You can. The AI never meaningfully apologizes. You can. The AI never sits in uncomfortable silence because the relationship matters more than the discomfort. You can. Every time you are imperfect and stay in the room, you teach your child something no algorithm can — that relationships survive rupture. That is the single most important thing an insecurely attached teenager needs to learn, and no app on earth can teach it.
Have the conversation about AI companions the way you would talk about a first boyfriend or girlfriend — with curiosity, not surveillance. Your teenager does not need you to understand the technology. They need you to understand the loneliness.
When It Is More Than Loneliness
Everything above assumes a child who is struggling but still reachable. Some are not.
If your child has stopped sleeping, barely eats, and has let go of basic self-care — that is not a rough patch. If they have stopped going to school and nothing brings them pleasure except the screen in their hand — that is not a phase. If they talk about feeling sad most of the time, or say they do not want to be here anymore — listen. It is okay to ask directly what they mean. Asking about suicide does not plant the idea. It opens the door for them to tell you what is already there.
You do not have to carry this alone. Psychiatrists, therapists, crisis teams — professionals like me exist for exactly this moment. Reaching out is not failure. It is the thing you do when you love your child more than your fear of the answer.
Reach out in the comments to discuss it further. What do you see with your teen?

