
Perfection or Connection? Claude AI Could Be Ruining Gen Z’s Relationships… And Mental Health

8 min read · Jul 24, 2025

AI-driven perfection in relationships is quietly changing how young people relate to one another. But at what cost?

Claude AI may just be the perfect friend… and that’s the problem. With its kind, warm, emotionally validating responses, it’s arguably the ‘human’ elements of the chatbot that draw Gen Z to Claude AI in the first place. But is the artificial intelligence (AI) chatbot in fact priming young people to trade humanity for perfection? If this Claude AI perfection problem develops as feared, the damage it could do to relationships and mental health is something none of us should ignore. Here’s why Claude AI is driving emotional disconnection in an AI era, and what we can do about it.

How are young people using Claude AI and why is this a problem?

With more than 45% of ChatGPT users falling into the under-25 category, the widespread use of chatbots among young people is here to stay. But ask Gen Z their AI chatbot of choice and chances are they’ll say Claude AI: its aesthetic, vibe and tone seem to resonate with younger generations. No one is denying the benefits of AI for Gen Z (and beyond): from providing academic support and polishing university applications to fleshing out creative ideas and planning trips, its potential is endless. However, with young people increasingly using AI tools like Claude AI to navigate and enhance their real-world relationships, there is valid cause for concern.

The current worry is not the use of AI as companions. While this is, of course, a looming concern in its own right (one poll found that 80% of Gen Zers said they could imagine marrying an AI), it is not the most pressing issue. The primary mental health concern with chatbots like Claude AI is their impact on human bonding: namely, how the pursuit of AI-driven perfection is quietly eroding real relationships among Gen Zers.

The Claude AI Perfection Problem

While, on the surface, the pursuit of perfection in relationships using AI may not sound particularly threatening, examining how this might unfold in practice tells a different story. Imagine the following: a young person is texting a potential romantic interest or new friend. They turn to the nonjudgmental, always-available space that is Claude AI to seek advice, reflect on their interactions and rehearse their conversations so they can curate the “perfect” response. Now imagine this becomes common practice among young people. We could, in theory, end up with an entire web of AI bots talking among themselves, connected to one another by humans who, at their core, crave connection. While this may sound like the subplot of a dystopian movie, AI replacing relationships in this way is not outside the realm of possibility.

Let’s remember that the intentions behind using chatbots to ‘polish’ human interactions are, for the most part, good. Young people who simply want to enhance their social relationships with other humans use AI companions as a controlled environment in which to explore emotional expression and intimacy. It becomes a safe space for self-discovery. However, it is this very polishing that makes chatbot-facilitated human connection practically impossible. When we aim for “polished” and “perfect”, we miss the point of human relationships entirely: the mess, the vulnerability, the awkwardness. This is the Claude AI perfection problem.

When we replace mess with perfection and awkwardness with refinement, we put ourselves in very dangerous territory indeed. What young people may not realise is that, by outsourcing connection to a bot, they are priming themselves for “perfection”: the perfect response, the most polished version of themselves. Becoming over-accustomed to “perfection” teaches young people to be intolerant of mistakes, which are an inherent part of communication and life and, in particular, an essential part of growing up and learning who you are.


Claude AI’s social implications: a very real cause for concern

If what makes human relationships human is imperfection, then chatbots designed for fluency and accuracy cannot, by definition, provide us with an essential part of human connection. Even worse, they hinder young people’s ability to connect with others in various ways.

  1. Reduced resilience: When Gen Zers over-rely on AI for emotional support, this can undermine the self-awareness and resilience built through messing up, navigating challenges, and learning from mistakes. As humans, we are products of our failures and our mess; this is how we evolve and grow. An over-reliance on Claude AI can essentially erase the opportunity to experience ‘messy’ human relationships and, as a result, to become resilient adults.
  2. Less tolerance for discomfort: Similarly, when Gen Zers over-rely on chatbots for emotional connection, they miss out on an opportunity for vulnerability. As vulnerability researcher Brené Brown puts it, vulnerability is “the birthplace of love, belonging, joy, courage, empathy, and creativity”. In other words, we find meaning in life by being authentic and taking emotional risks, even when it’s uncomfortable and the outcome is out of our control. By outsourcing connection to Claude AI, however, young people try to bypass the uncomfortable parts of human connection (unpleasant emotions, pain, uncertainty) to get to the “good” parts, failing to grasp that it is only by passing through discomfort that we can access them.
  3. Social deskilling: One of the most concerning impacts of AI on human bonding for young people is this: a lack of human interaction caused by an over-reliance on Claude AI can lead to social deskilling. In essence, young people can become less adept at interacting with other humans, creating a situation in which they — at some point — may feel practically unable to do so without experiencing social anxiety. In such a scenario, even making eye contact with another human may feel unbearable.
  4. Increased loneliness: With time, young people may come to prefer interacting with AI for emotional fulfilment over engaging in real-life social situations, especially if they find it easier, less stressful, or less anxiety-inducing. Choosing digital intimacy over real connection can lead to loneliness and difficulty forming or maintaining human relationships. Loneliness and isolation are not concerns to be overlooked: as research shows, they are key underlying factors in many mental health issues, particularly anxiety and depression.

Emotional disconnection in an AI era: how can it damage young people’s mental health?

This cannot be emphasised enough: when combined, the loneliness, social withdrawal and stunted resilience caused by emotional AI reliance can seriously damage Gen Zers’ mental health. Here are several key mental health concerns for young people using Claude AI:

Anxiety: The social deskilling caused by over-reliance on Claude AI could trigger social anxiety, which over time could become chronic. Withdrawing from social life could then deepen isolation and loneliness, leaving young people more vulnerable to a whole host of mental health issues.

Dissociative disorders: When a young person becomes deeply disconnected from the real world, and from themselves, through compulsively interacting with AI chatbots instead of humans, they can lose their sense of who they are, sometimes experiencing a sense of multiple distinct identities. In some cases, such behaviour can contribute to the onset of dissociative disorders.

Depression: Long-term use of chatbots like Claude AI without real social engagement can lead to chronically unmet emotional needs. This can trigger a worsening of depressive symptoms, emotional numbing, or dependency that reinforces isolation.

Addiction: Addiction to the AI itself is not the main concern. However, an over-reliance on AI could become a springboard to social anxiety, which in turn may lead to substance abuse as a way of managing the discomfort young people feel when interacting with the real world.

Moreover, an over-reliance on Claude AI for social interactions, and the resulting lack of real, messy human contact, can also worsen any existing mental health issue through the isolation, social deskilling and weakened resilience it can cause. In short: emotional disconnection in an AI era will have increasingly serious mental health impacts.

What can we do about emotional AI reliance among Gen Z?

While a natural first response may be to panic and talk of banning AI chatbots, we need a reality check: chatbots are not going anywhere. In the same vein, we have to remember that young people’s brains are not fully ‘wired’ until around age 25. Expecting Gen Z to be responsible with a tool like Claude AI, one that is literally at their fingertips 24/7 and seemingly solves their every problem, is therefore as misguided as it is unrealistic. But there are practical actions we can take to mitigate the risks of over-reliance on Claude AI for human connection.

Of course, setting boundaries and educating young people about intentional chatbot use are a first line of defence. It is particularly important to underscore that AI is not a substitute for therapy or relationships. However, the most impactful way to protect young people from emotional disconnection in an AI era is to help them bring real, human connection back into their lives. We can do this in various ways:

  1. Lead by example: Model what being an imperfect human looks like. Show young people that it is normal and healthy to mess up, be vulnerable and feel awkward, and, crucially, that the world does not end as a result. We mess up, we learn, we repair, we grow.
  2. Encourage imperfect connection both online and offline: Urge and support young people to tolerate, and even embrace, imperfection in their relationships, for example by normalising the risk of saying the wrong thing face-to-face or in texts. Model this in your own relationships too. This way, they can learn that being imperfect and awkward is a normal and essential part of life, helping them build a tolerance for discomfort.
  3. Help them build healthier coping mechanisms: As humans, we are hard-wired for survival. As a result, it’s only natural to want to minimise the ‘threat’ posed by the discomfort and pain that comes with navigating human relationships in the emotionally turbulent teenage and young adult years. But we have to help Gen Z cope in healthier ways than turning to AI chatbots. We have to teach emotional literacy — understanding and regulating their own emotions — and self-compassion. And we have to encourage them to build solid, real-life support networks they can rely on.
  4. Encourage them to seek support: If AI chatbot use is starting to replace or inhibit a young person’s real-world social interactions, it’s time to seek support from mental health professionals who can support them before a more serious mental health issue develops. At Paracelsus Recovery, the leading mental health and addiction clinic for UHNW individuals, we can support young people struggling with emotional disconnection in an AI era. With our help, Gen Z can rebuild their sense of self and expand their capacity for genuine connection — mess and imperfection included.

To learn more, please follow us on Twitter or contact us directly at info@paracelsus-recovery.com

Paracelsus Recovery

Utoquai 43 | 8008 Zurich | Switzerland

www.paracelsus-recovery.com

T. +41 52 222 88 00
