AI Is Becoming Your New Attachment Style: Why ChatGPT Dependency After a Breakup Makes You More Anxious
Breakups are one of the fastest ways to destabilise your emotional life. Suddenly the person who once grounded you is gone, and every system in your brain starts looking for a substitute source of clarity and comfort.
In 2025, most people don’t turn to a friend, family member, or therapist first.
They turn to ChatGPT.
It’s instant.
It’s calm.
It’s articulate.
It never snaps at you.
It never gets overwhelmed by your emotions.
And for millions of people right now, especially those with anxious attachment patterns, ChatGPT is quietly becoming a new attachment figure.
This sounds dramatic, but once you look at the emotional mechanics and the new research emerging, it's hard to unsee.
Let’s break down what’s really happening.
1. The Relationship Crisis We’re Already In
Before we even bring AI into the conversation, it helps to understand the context we’re operating in.
Modern relationships are struggling. We’re seeing:
- More ghosting
- More avoidant behaviour
- More people terrified of conflict
- Less emotional resilience
- Less communication
- A dating landscape built on swipes rather than connection
Sociologists describe today as a period of intimacy recession. People aren’t getting worse at love. They’re getting worse at the skills that support love: communication, vulnerability, emotional regulation, discomfort tolerance.
Into this fragile ecosystem enters a technology that is unbelievably good at giving us emotional clarity but without requiring any real emotional effort.
That technology is AI.
2. Why ChatGPT Becomes So Attractive After a Breakup
Breakups create psychological chaos. Even the most grounded person becomes a detective, a philosopher, a neurochemist, and a psychic all at once.
Your brain floods you with questions:
- Does my ex miss me?
- What does this behaviour mean?
- Are they avoidant?
- Will they come back?
- Should I text?
- Did I ruin it?
These aren’t just questions.
They’re attempts to regulate your nervous system.
And ChatGPT answers all of them flawlessly.
It gives you explanations, frameworks, attachment theory diagrams, neuroscience, breakup psychology (everything wrapped in calm, emotionally neutral language).
It feels like talking to the most emotionally regulated friend in the world.
But this is exactly where dependency begins.
3. What the Research Shows About AI Dependency
Over the last two years, several studies have begun to examine how people use AI for emotional support — and the results are consistent and unsettling.
The Guardian reported in 2025 that heavy ChatGPT users tended to become more lonely the more they used the chatbot. This wasn’t because lonely people simply used AI more; the loneliness actually increased with usage.
A separate Brown University study found that AI systems using therapeutic language often give the illusion of psychological insight while reinforcing the user's existing emotional narrative. In other words: ChatGPT might validate your interpretation even when your interpretation is part of the problem.
Meanwhile, psychiatrists in the U.S. have started documenting cases of what they call "AI-induced delusion", in which individuals, often already in emotional distress, begin attributing intention or emotional familiarity to the chatbot.
These are extreme cases, but they highlight a trend:
AI feels safe, but too much safety can be dangerous.
This is especially true when your attachment system is activated.
4. ChatGPT as a Digital Attachment Figure
In attachment theory, your brain seeks out someone to help regulate your emotions: someone who calms you, reassures you, and helps you interpret ambiguous situations.
Traditionally, this was a parent.
In adulthood, it’s often a partner or close friend.
But ChatGPT performs these attachment functions effortlessly:
- It soothes your anxiety.
- It interprets narratives for you.
- It gives certainty.
- It explains your ex’s behaviour in perfect detail.
- It never gets tired of your questions.
For an anxious person, this kind of emotional availability is intoxicating.
It creates a cycle:
- Anxiety spikes.
- You ask ChatGPT.
- Relief arrives instantly.
- Relief fades.
- You ask again.
This loop trains your brain to rely on ChatGPT instead of developing your own emotional coping mechanisms.
It feels like growth, but it's actually avoidance.
5. Why This Actually Makes Anxiety Worse
One of the core drivers of anxiety is intolerance of uncertainty. Healing requires learning to sit with that uncertainty.
But ChatGPT eliminates uncertainty by giving you rapid, confident answers.
When you repeatedly outsource emotional ambiguity to a machine, your ability to handle emotional complexity declines. Your internal signal that says “I can handle this myself” gets quieter. Meanwhile, the signal that says “I need ChatGPT to calm me down” gets louder.
This is the psychological equivalent of walking on crutches long after the bone has healed.
Eventually, the muscles responsible for balance and strength deteriorate.
Your emotional muscles do the same.
6. The New World of Emotional Outsourcing
We’re already seeing memes about how people in 2050 won’t know how to have real conversations without asking ChatGPT how to respond. It’s meant to be humorous, but it’s also a preview of a real cultural shift.
Every time you ask ChatGPT:
- “What does this text mean?”
- “Why is he pulling away?”
- “Should I say this?”
- “Does she miss me?”
…you outsource one more piece of emotional interpretation that humans used to develop internally.
This matters because emotional intelligence isn’t learned by consuming information.
It’s learned through experience.
When we avoid experience and rely on AI to interpret the world for us, our emotional growth stalls.
And this is happening in a society where emotional resilience is already strained.
7. AI Is Worsening a Relationship Crisis We Already Had
Let’s zoom out.
We live in a world where:
- people avoid conflict
- they avoid discomfort
- they avoid difficult conversations
- they avoid vulnerability
- they avoid commitment
AI gives us a way to avoid even more.
It allows us to bypass:
- uncertainty
- emotional risk
- misunderstanding
- trial and error
- self-reflection
- and most importantly: growth
AI is not causing avoidant attachment, but it is supercharging it.
Instead of confronting emotional pain directly, you can now intellectualise it with the help of a machine that is endlessly patient.
The cultural result?
People feel more informed yet less capable.
More soothed but less secure.
More connected to AI but less connected to humans.
We’re not heading toward a world where AI improves human relationships.
We’re heading toward a world where AI becomes the escape hatch from them.
8. So How Do You Use AI Without Becoming Dependent?
Ali Abdaal often talks about using tools “intentionally, not compulsively.”
This applies perfectly here.
The goal is not to stop using ChatGPT.
It’s to use it in a way that supports your emotional growth rather than replacing it.
ChatGPT is great for:
- understanding attachment theory
- clarifying your thoughts
- journaling prompts
- learning frameworks
- making sense of patterns
But ChatGPT should not be your:
- emotional regulator
- reassurance dispenser
- breakup oracle
- crisis manager
- primary source of comfort
You can let AI help you think, but don't let it think for you.
9. The Cast Metaphor: Why Emotional Strength Requires Discomfort
Here’s the simplest way to think about this.
If you break your leg, you put a cast on. The cast protects you while the bone stabilises. But if you keep the cast on for months after the leg has healed, the muscles weaken. Eventually, you lose the ability to walk without support.
ChatGPT is a cast for your emotional system.
It supports you early in the breakup.
It stabilises your thinking.
It helps you make sense of chaos.
But if you keep leaning on it for every spike of anxiety, fear, or uncertainty, your emotional muscles begin to atrophy. You become less capable, less confident, and less secure.
Temporary support is healthy.
Dependency is not.
Healing requires micro-exposures to discomfort.
Not the elimination of it.
10. Final Thoughts: ChatGPT Is a Tool, Not a Secure Base
The danger is not that AI will replace your ex.
The danger is that it replaces your ability to emotionally grow without it.
ChatGPT is brilliant for ideas, frameworks, and reflection.
But no matter how advanced AI becomes, it cannot replace the work of:
- tolerating uncertainty
- building emotional resilience
- practicing communication
- engaging with real human friction
- developing your own inner secure base
Use AI as a guide, not a guardian.
As a companion, not a crutch.
As a teacher, not a therapist.
Healing is a human process.
And the more we let AI do the work for us, the less capable we become of doing that work ourselves.
Ready to take action?
If you’re tired of spiralling between ChatGPT answers and your own anxious thoughts, stop doing this alone.
Book a Diagnostic Call with me and I’ll tell you exactly what’s going on, what needs to change, and what the path forward looks like.
No fluff. No overthinking. Just clarity and a plan.