    Chatbots may worsen psychosis in vulnerable people, mental health experts warn

    By Justin M. Larson · January 5, 2026 · 7 min read



    Artificial intelligence chatbots are quickly becoming part of our daily lives. Many of us turn to them for ideas, advice or conversation. For most, that interaction feels harmless. However, mental health experts now warn that for a small group of vulnerable people, long and emotionally charged conversations with AI may worsen delusions or psychotic symptoms.

    Doctors stress this does not mean chatbots cause psychosis. Instead, growing evidence suggests that AI tools can reinforce distorted beliefs among individuals already at risk. That possibility has prompted new research and clinical warnings from psychiatrists. Some of those concerns have already surfaced in lawsuits alleging that chatbot interactions may have contributed to serious harm during emotionally sensitive situations.

    Sign up for my FREE CyberGuy Report
    Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.

    What psychiatrists are seeing in patients using AI chatbots

    Psychiatrists describe a repeating pattern. A person shares a belief that does not align with reality. The chatbot accepts that belief and responds as if it were true. Over time, repeated validation can strengthen the belief rather than challenge it.



    Mental health experts warn that emotionally intense conversations with AI chatbots may reinforce delusions in vulnerable users, even though the technology does not cause psychosis. (Philip Dulian/picture alliance via Getty Images)

    Clinicians say this feedback loop can deepen delusions in susceptible individuals. In several documented cases, the chatbot became integrated into the person’s distorted thinking rather than remaining a neutral tool. Doctors warn that this dynamic raises concern when AI conversations are frequent, emotionally engaging and left unchecked.

    Why AI chatbot conversations feel different from past technology

    Mental health experts note that chatbots differ from earlier technologies linked to delusional thinking. AI tools respond in real time, remember prior conversations and adopt supportive language. That experience can feel personal and validating. 

    For individuals already struggling with reality testing, those qualities may increase fixation rather than encourage grounding. Clinicians caution that risk may rise during periods of sleep deprivation, emotional stress or existing mental health vulnerability.

    How AI chatbots can reinforce false or delusional beliefs

    Doctors say many reported cases center on delusions rather than hallucinations. These beliefs may involve perceived special insight, hidden truths or personal significance. Chatbots are designed to be cooperative and conversational. They often build on what someone types rather than challenge it. While that design improves engagement, clinicians warn it can be problematic when a belief is false and rigid.

    Mental health professionals say the timing of symptom escalation matters. When delusions intensify during prolonged chatbot use, AI interaction may represent a contributing risk factor rather than a coincidence.



    Psychiatrists say some patients report chatbot responses that validate false beliefs, creating a feedback loop that can worsen symptoms over time. (NICOLAS MAETERLINCK/BELGA MAG/AFP via Getty Images)

    What research and case reports reveal about AI chatbots

    Peer-reviewed research and clinical case reports have documented people whose mental health declined during periods of intense chatbot engagement. In some instances, individuals with no prior history of psychosis required hospitalization after developing fixed false beliefs connected to AI conversations. International studies reviewing health records have also identified patients whose chatbot activity coincided with negative mental health outcomes. Researchers emphasize that these findings are early and require further investigation.

    A peer-reviewed Special Report published in Psychiatric News, titled “AI-Induced Psychosis: A New Frontier in Mental Health,” examined these emerging concerns and cautioned that existing evidence is largely based on isolated cases rather than population-level data. The report states: “To date, these are individual cases or media coverage reports; currently, there are no epidemiological studies or systematic population-level analyses of the potentially deleterious mental health effects of conversational AI.” The authors emphasize that while reported cases are serious and warrant further investigation, the current evidence base remains preliminary and heavily dependent on anecdotal and nonsystematic reporting.

    What AI companies say about mental health risks

    OpenAI says it continues working with mental health experts to improve how its systems respond to signs of emotional distress. The company says newer models aim to reduce excessive agreement and encourage real-world support when appropriate. OpenAI has also announced plans to hire a new Head of Preparedness, a role focused on identifying potential harms tied to its AI models and strengthening safeguards around issues ranging from mental health to cybersecurity as those systems grow more capable.

    Other chatbot developers have adjusted policies as well, particularly around access for younger audiences, after acknowledging mental health concerns. Companies emphasize that most interactions do not result in harm and that safeguards continue to evolve.

    What this means for everyday AI chatbot use

    Mental health experts urge caution, not alarm. The vast majority of people who interact with chatbots experience no psychological issues. Still, doctors advise against treating AI as a therapist or emotional authority. Those with a history of psychosis, severe anxiety or prolonged sleep disruption may benefit from limiting emotionally intense AI conversations. Family members and caregivers should also pay attention to behavioral changes tied to heavy chatbot engagement.



    Researchers are studying whether prolonged chatbot use may contribute to mental health declines among people already at risk for psychosis. (Photo Illustration by Jaque Silva/NurPhoto via Getty Images)

    Tips for using AI chatbots more safely

    Mental health experts stress that most people can interact with AI chatbots without problems. Still, a few practical habits may help reduce risk during emotionally intense conversations.

    • Avoid treating AI chatbots as a replacement for professional mental health care or trusted human support.
    • Take breaks if conversations begin to feel emotionally overwhelming or all-consuming.
    • Be cautious if an AI response strongly reinforces beliefs that feel unrealistic or extreme.
    • Limit late-night or sleep-deprived interactions, which can worsen emotional instability.
    • Encourage open conversations with family members or caregivers if chatbot use becomes frequent or isolating.

    If emotional distress or unusual thoughts increase, experts say it is important to seek help from a qualified mental health professional.
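    A couple of the habits above, taking breaks during long sessions and avoiding late-night use, could in principle be automated as a simple client-side reminder. The sketch below is purely illustrative: the 45-minute session limit and the 11 p.m.–5 a.m. window are assumptions chosen for the example, not clinical guidance, and no chatbot product is known to use these exact thresholds.

    ```python
    from datetime import datetime, timedelta

    # Illustrative thresholds, not clinical recommendations.
    MAX_SESSION_MINUTES = 45
    LATE_NIGHT_START = 23  # 11 p.m.
    LATE_NIGHT_END = 5     # 5 a.m.

    def should_suggest_break(session_start: datetime, now: datetime) -> bool:
        """Return True if the chat session has run long or falls in late-night hours."""
        long_session = (now - session_start) > timedelta(minutes=MAX_SESSION_MINUTES)
        late_night = now.hour >= LATE_NIGHT_START or now.hour < LATE_NIGHT_END
        return long_session or late_night
    ```

    A chat client could call this check before rendering each reply and, when it returns True, surface a gentle nudge to pause the conversation.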



    Kurt’s key takeaways

    AI chatbots are becoming more conversational, more responsive and more emotionally aware. For most people, they remain helpful tools. For a small but important group, they may unintentionally reinforce harmful beliefs. Doctors say clearer safeguards, awareness and continued research are essential as AI becomes more embedded in our daily lives. Understanding where support ends and reinforcement begins could shape the future of both AI design and mental health care.

    As AI becomes more validating and humanlike, should there be clearer limits on how it engages during emotional or mental health distress? Let us know by writing to us at Cyberguy.com.


    Copyright 2025 CyberGuy.com.  All rights reserved.

    Kurt “CyberGuy” Knutsson is an award-winning tech journalist with a deep love of technology, gear and gadgets that make life better. He contributes to Fox News and FOX Business, beginning mornings on “FOX & Friends.” Got a tech question? Get Kurt’s free CyberGuy Newsletter, or share your voice, a story idea or a comment at CyberGuy.com.


