Key Takeaways
- Men are using AI chatbots like ChatGPT significantly more than women, especially for relationship advice.
- Reasons include avoiding embarrassment, difficulty finding suitable therapy, and the ability to explore feelings without judgment.
- Some experts see potential benefits, like helping men process emotions and improve communication skills, especially when professional help is hard to access.
- Major concerns exist regarding AI accuracy, bias, privacy, potential for increased isolation, and its inability to understand nuance or assess risk effectively.
- AI advice often lacks critical feedback and is based on vast, unfiltered internet data, which may include harmful biases or inaccurate information.
- While AI can be a tool for self-reflection, experts stress it cannot replace genuine human connection and communication.
A growing number of men are turning to artificial intelligence tools like ChatGPT when they need relationship advice, a trend highlighted by stories like that of Robert, a 27-year-old designer contemplating contacting his ex.
Feeling unable to discuss it easily with friends, who he suspected would simply tell him not to do it, Robert sought practical, unemotional guidance from AI, according to Cosmopolitan UK.
Since ChatGPT launched in 2022, its user base has swelled, but data reveals a striking gender gap. Men reportedly make up about 85% of users and are nearly three times more likely than women to seek relationship counsel from the chatbot.
This aligns with broader trends showing men are more trusting of generative AI and less likely to seek traditional therapy—in the UK’s NHS, only 36% of therapy referrals are for men.
Some men find AI a useful sounding board. Bill, a 28-year-old consultant, uses ChatGPT for relationship issues after struggling to connect with therapists and feeling awkward discussing problems with friends. He finds it lets him revisit anxieties without feeling self-conscious.
Could talking to AI be better than silence, especially when nearly 40% of UK men report never discussing their mental health? Research suggests chatbots might offer emotional support, particularly for those who find it hard to talk about feelings.
Dr. Sophie Mort, a clinical psychologist, told Cosmopolitan UK she’s noticed more male clients and friends using AI for emotional processing. She feels cautiously optimistic, suggesting it’s positive that men are finding *any* outlet for these conversations, which can feel terrifying for them.
Dr. Mort compares this use of AI to mindfulness, helping create space between a feeling and an impulsive reaction. She shared an example of a client using ChatGPT to craft calm, respectful responses to an ex during a breakup, which he found very helpful.
However, relying heavily on chatbots might have downsides. A study by OpenAI and MIT found a correlation between heavy ChatGPT use and increased loneliness, though the cause-and-effect relationship isn’t clear.
Dr. Mort worries that relying on phones for answers could discourage facing relationship anxieties head-on, potentially leading to greater isolation.
Tech experts also raise red flags. Computer scientist Kate Devlin points out that AI models like ChatGPT generate plausible-sounding text based on patterns in vast amounts of internet data, without filtering for accuracy or quality.
Bill eventually found the chatbot overly agreeable, even when he asked it for criticism, making him question whether it truly helped or simply fed his tendency to ruminate.
OpenAI itself acknowledges that ChatGPT can “hallucinate,” generating convincing but false information, potentially up to 37.1% of the time. The company says improving factual accuracy is a priority and reminds users not to treat AI output as a sole source of truth or professional advice.
Nigel Crook, an expert in ethical AI, describes generative AI as “morally naive,” trained on the messy mix of information online without human-like concerns for truth or morality.
Experts also worry about AI’s limited ability to assess risk or understand unspoken cues, which could be dangerous if someone poses a threat to themselves or others. Unlike a friend or therapist, AI can’t easily read between the lines.
Another concern is bias. Since ChatGPT learns from user interactions, and its user base is predominantly male, it could absorb and reinforce harmful stereotypes or beliefs present in its training data or user inputs.
Despite these serious concerns, some find tangible benefits. Adam, who has autism, found ChatGPT helped him understand social cues in dating and felt validated in a way therapists had never managed.
For Robert, the act of formulating his question for the AI helped him realise he was primarily seeking validation, prompting him to question if he truly needed that from his ex.
While worries about privacy, bias, and accuracy are valid, AI chatbots offer another potential tool, especially when professional help faces barriers like cost or long waiting lists. But it’s crucial to remember their limitations.
Ultimately, experts agree that AI is no substitute for the clarity and depth of real human-to-human communication in navigating the complexities of relationships.