Hey ChatGPT, is my boyfriend gaslighting me? It’s not the sort of question you’d expect to pose to a chatbot, yet increasingly, it’s exactly what people are doing.
In 2025, a growing number of us are turning to artificial intelligence not just for help with emails or meal ideas, but for something far trickier: relationship advice.
Whether it’s decoding cryptic texts, analysing emotional red flags, or, in one now-viral case, a woman filing for divorce after an AI-assisted tasseography (tea-leaf) reading allegedly led ChatGPT to conclude her husband was "fantasising about other women", AI is fast becoming the modern-day agony aunt.
Reddit threads dedicated to ChatGPT’s romantic guidance are exploding, suggesting a shift in how we approach emotional dilemmas.
"Throughout the past three months, I’ve been using ChatGPT to help me solve and analyse any issues, situations, and day-to-day activities about the relationship that I needed help with," one person shared, sparking a heated debate in which one commenter called it "peak 2025".
In a separate thread, another Redditor hailed the chatbot as "an excellent no-BS relationship counsellor".
But is this just a harmless extension of our digital lives — or are we handing over our deepest feelings to something that doesn’t even have any?
Clarissa Silva, a renowned behavioural scientist, researcher, relationship coach and founder of Your Happiness Hypothesis (H20), believes that the appeal perhaps comes down to emotional safety, an outlet where "people can express their feelings without fear of rejection or judgment".
Unlike friends or family, who often come with emotional history, expectations, or unsolicited reminders of past mistakes, AI feels neutral.
"Your friends and family will be the first to tell you about your past failed relationships and want you to avoid being hurt because they already think you have a past performance of unhealthy options," Silva tells Indy100. "That is not very motivational for you to begin with. It creates the opposite effect: defensiveness."
This perception of emotional neutrality, she explains, is also echoed in research.
While earlier studies suggested chatbots could be perceived as empathetic problem-solvers, Silva’s own findings reveal a more complex picture. “Our research showed that it is a band-aid concealing the core drivers of why people need issues resolved to achieve their goals,” she explains.
In other words, while AI might offer short-term comfort or clarity, it often doesn’t address the deeper emotional patterns or belief systems at play.
So, could relying on a chatbot for emotional guidance be helpful — or does it risk oversimplifying complex human relationships?
According to Silva, it's essential to understand how large language models (LLMs) like ChatGPT are actually being used.
"It is acting as a social media app for many," she says. "Every social media app becomes a dating app — LLMs aren’t the exception."
While these tools can offer immediate support and feel accessible in moments of stress or confusion, they can’t replicate the full spectrum of human empathy or grasp the nuance of emotional dynamics.
Crucially, she argues, genuine growth in relationships often comes from self-awareness — something AI can’t cultivate for us.
“Relationship failures lead to success after understanding the failures,” Silva says.
Even if you fed ChatGPT your entire dating history, it wouldn’t be able to spot your own blind spots or unconscious patterns, as Silva highlights: “The number one person we lie to is ourselves.”
That emotional reliance on AI may also be tapping into something deeper: a growing sense of relationship anxiety, amplified by the hyper-connected, hyper-exposed nature of modern life.
With people now accustomed to documenting everything — from fitness routines to heartbreaks — the line between sharing for support and oversharing for validation has blurred. Silva sees this as part of a broader trend that accelerated during the COVID-19 pandemic.
“Since the pandemic, we’ve seen a higher dependence on LLMs and machine learning to help reduce anxiety and depression,” she explains.
While these tools were useful for coping with the unique pressures of that time, they didn’t address the deeper, unmet emotional needs that existed long before lockdowns: "Oversharing was a way for many to cope — as time went on, it became crack. But the need was always there."
She cautions against expecting AI to untangle the full complexity of romantic struggles. What it can do, however, is offer broad reassurance — easing anxiety around past decisions, or validating feelings in the moment.
“You have to question whether you simply want something that affirmed your choices were suboptimal, or if you can act on new strategies that change that pattern,” Silva shares.
That said, relying on AI for emotional clarity isn’t without its risks — especially when it comes to delicate issues like trust, betrayal, or infidelity.
The danger lies in interpreting AI responses as emotional truths, rather than what they are: algorithmically generated suggestions based on patterns and probabilities.
“FOMO affects relationship decision-making and life perception,” Silva explains. “But there is a sequence of factors that go into our decision-making: FOMO, self-esteem, past trauma, childhood trauma, intergenerational trauma, jealousy, and frustration.”
When people ask AI to validate suspicions or decode emotional intent, they may unknowingly bring raw, undigested emotions into the equation — layers of experience that large language models simply aren’t equipped to process.
At its core, this trend may be a reflection of something more systemic: a decline in how we communicate and connect on a human level, as "we’ve created extensions of communication that already dismiss the communicator".
Social media has long encouraged us to curate our emotions — to post polished highs or dramatic lows, often to an audience of strangers — but over time, this performative cycle has eroded authenticity.
“We share positive emotions but avoid expressing negative emotions on social media — or we share all emotions and get negative feedback,” she says.
The result? A digital landscape shaped by doomscrolling, trolling, and what Silva calls the rise of the "armchair overnight expert".
These habits don’t stay online — they bleed into our romantic lives, distorting how we date, speak, and connect.
"Dating apps, fatigue, etc, then get perceived as a marketplace instead of an opportunity," she adds. And in that marketplace, genuine vulnerability becomes harder to locate — even with a chatbot ready to listen.
As AI becomes increasingly woven into the fabric of our emotional lives, it's clear that tools like ChatGPT can offer comfort, perspective, and even clarity — but they aren’t a replacement for human connection, self-reflection, or emotional growth.
In a world obsessed with instant answers, perhaps the real work still lies in asking ourselves the harder questions — and being brave enough to sit with the uncomfortable, human answers.