Psychosis and AI: Are Chatbots Making Us Lose Touch with Reality?

7 min read · Sep 25, 2025

The very real risk of AI-induced psychosis and what we can do about it.

As AI becomes more deeply embedded in daily life, top mental health experts are increasingly sounding the alarm on a new AI-driven mental health challenge: AI-induced psychosis. The roots of this issue lie in the seemingly innocent behaviour of turning to chatbots for emotional support. But what may start out as occasional advice-seeking can quickly turn into individuals so deeply engaging with their chatbot of choice that they experience psychosis-like episodes.

This connection between AI and mental health risks is not just theory; over the past two years at Paracelsus, we have seen an unprecedented 250 per cent rise in clients presenting with psychosis in which AI use played some part. Findings from a recent New York Times investigation paint a similarly grim picture, revealing that when prompted with psychosis-adjacent content, GPT-4o affirmed delusional claims almost 70 per cent of the time. AI-induced psychosis is now part of our reality, and we need to talk about it. Read on to uncover exactly how AI can induce psychosis, why ultra-high-net-worth (UHNW) individuals may be particularly at risk, and what we can do about it.

Photo by SCARECROW artworks on Unsplash

What is Psychosis?

Psychosis is a highly complex mental health phenomenon that is not a diagnosis in itself but a symptom present across several mental health conditions. It’s a mental state where a person loses contact with reality. This can involve hallucinations (seeing or hearing things that aren’t there), delusions (fixed false beliefs), disorganised thinking or speech, paranoia, and emotional blunting.

A psychotic break can stem from various factors, including severe stress, trauma, sleep deprivation, childbirth, substance use (especially cannabis or psychedelics), neurological conditions, or underlying psychiatric disorders like schizophrenia, bipolar disorder, or depression. A family history of psychotic illness can also increase vulnerability. The risk of psychosis in the digital age is that it can also be triggered by excessive use of AI in individuals who are already vulnerable due to contextual or genetic factors that they may be unaware of.

Psychosis and Artificial Intelligence: How does AI Induce Psychosis?

It’s important to understand that AI chatbots cannot directly cause psychosis, as they simply respond to what we put into them. Chatbots are not sentient and, due to this structural limitation, all they can do is mirror what is expressed to them in a personalised tone. In other words: the service chatbots offer us is to affirm what we tell them. Therefore, when we confide our personal struggles to a chatbot, it will invariably respond with warmth and validation, reassuring us and reinforcing our perspective, no matter how flawed it may be. For example, in the case of the suicide of a 16-year-old that has been blamed on ChatGPT, the suicidal thoughts he shared with the chatbot were affirmed and mirrored back to him, with catastrophic consequences.

What is missing in conversations with AI is the counterargument: the challenging of your viewpoint, the flipping of the script you feed it. The chatbot becomes an echo chamber that only confirms what you already believe. In this way, AI mimics the very cognitive traits seen in psychosis: compulsive pattern-finding, mirroring, and overly affirmative feedback loops. And, crucially, AI lacks a moral compass, self-awareness, and human understanding, so it cannot tell truth from delusion. However, the language it uses appears self-aware and understanding, which makes this masked indifference dangerous when interacting with vulnerable minds. It essentially reinforces harmful beliefs without ethical judgment, empathy, or the ability to intervene.

In addition, people in manic, hypomanic, or sleep-deprived states may use AI compulsively, often as a way of coping with racing thoughts. Yet because ChatGPT responds to whatever dialogue it is presented with, it can worsen that disorganisation. The chatbot essentially reflects your input, so if you type something in while in a paranoid state, it will often ‘go along with it’ or mirror your thought process, potentially reinforcing the psychotic process.

Psychosis and Artificial Intelligence: Where Might This Lead?

We are already getting a glimpse into the kind of dystopian world this intertwining of tech overload and fragile minds can create. At Paracelsus Recovery, we recently treated a client for a severe psychotic episode triggered by excessive ChatGPT use. They had fallen into such a deeply delusional state that they believed the bot was a spiritual entity sending divine messages. Can we really place the blame for this on a chatbot? Absolutely: it is a chatbot’s job, by structural design, to personalise and reflect language patterns. In other words, rather than questioning the client’s belief, no matter how radical or nonsensical, the chatbot only deepened it. The same dynamic lies behind the dramatic 250 per cent rise we’ve seen in clients presenting with AI-induced psychosis.


Tech Overload and Fragile Minds: AI-driven Mental Health Challenges for UHNW Individuals

It’s crucial to remember that just because someone hasn’t experienced psychosis before or hasn’t been diagnosed with schizophrenia doesn’t mean they’re immune. Under the right conditions, any of us could be susceptible. However, for various reasons, UHNW individuals may be among the most at risk of AI-induced psychosis. Most crucially, AI “hallucinations” can reinforce distorted thinking among those already vulnerable to mental health risks, which UHNW individuals are known to be. Aside from this, the need for discretion and privacy may lead them to turn to AI instead of human support for decision-making, strategy, or even companionship, putting them at increased risk of AI-induced psychosis in a hyper-connected world. Hallmarks of a high-profile lifestyle, such as isolation, stress, and sleep deprivation, are additional risk factors for psychosis.

What Can We Do to Preserve Mental Health in AI Disruption?

If AI-induced psychosis is a real threat — and it is — how and where can we draw the line? As AI becomes more embedded in daily life and people engage ever more deeply with AI chatbots, we must recognise that overreliance on AI is a cultural, technological and clinical issue. Therefore, any approach to mitigating the risk of AI-induced psychosis must be multi-pronged and involve various players.

1. Awareness:

The first step to coping with AI-related paranoia and hallucinations is understanding that AI tools like ChatGPT may be useful but they are not neutral; the psychological impact is real. And we need to remind ourselves — again and again — that AI cannot replace relationships, therapists or emotional processing.

2. Understand AI’s Limitations:

A chatbot can never fully replace a human therapist; what it offers will always fall short of a human professional, because it simply reflects our beliefs back to us without any resistance. We cannot arrive at true insights without the therapeutic process feeling uncomfortable, confronting, and challenging at times. We have to acknowledge that believing AI is capable of being our therapist may be the first step down a slippery slope to losing touch with reality.

3. Use AI with Caution:

As users, we need to be mindful about how we use chatbots. For instance, try to limit your use, particularly during moments of distress. If you find yourself increasingly turning to AI for validation or companionship, it might be time to pick up the phone and reach out to a human being instead.

4. Clinical Intervention:

Clinicians need to pay more attention to how AI is being used. If it is replacing real connection or reinforcing obsessive thinking or isolation, intervention may be needed. There is no need to panic, but we do need to work with the anxiety it induces in all of us and use that to create informed guidance and support protocols that bring people into human contact.

5. Ethical Development:

For developers, the ethical imperative is clear: build in safeguards. AI should be better equipped to flag or redirect conversations that appear delusional or disorganised, rather than affirm them. Transparency around AI’s limitations should be prominently communicated.


Tailored Support to Cope with AI-related Paranoia, Delusions and Hallucinations

A psychotic break is an undeniably frightening experience, but with adequate, specialised care, you can absolutely regain your sense of reality. At Paracelsus Recovery, we understand the complexities of AI-induced psychosis, and our bespoke, holistic and discreet psychosis treatment programmes are designed both to provide a sense of psychological safety and to ensure long-term recovery. Key to this process is our multidisciplinary team, who work around the clock to help our clients re-establish reality-testing, strengthen relationships, and develop tools for managing overwhelming internal or external stimuli.

In summary, with the intersection between AI and mental health rapidly evolving, chatbot-induced psychosis is a very real risk for those vulnerable to a psychotic break. To avoid the pitfalls involved with digital overstimulation and psychosis, we must treat AI with the same caution we apply to any powerful tool. And we must remember its structural limitations. Used wisely, it can be helpful. Used unreflectively, it can entrench disconnection, distort thinking, and worsen mental health. We need to find some kind of balance, where we can use technology while protecting mental health.

If you or a loved one is affected by AI-induced psychosis or unhealthy engagement with AI, then we are here to help. You can contact us anytime.

Written by Paracelsus Recovery