Mental health chatbots demonstrate greater efficacy when users establish an emotional connection with their AI therapists, according to research conducted by the University of Sussex. Published in the journal Social Science & Medicine, the study analyzed feedback from over 4,000 users of a leading mental health application, highlighting both the benefits and potential risks associated with what researchers term “synthetic intimacy.”
The findings reveal that more than one in three U.K. residents now turn to AI for mental health support, and that therapy outcomes improve significantly when users feel emotionally close to their chatbot. That same closeness, however, raises concerns about the consequences of forming such bonds with artificial intelligence.
Dr. Runyu Shi, Assistant Professor at the University of Sussex, emphasized the importance of emotional connections, stating, “Forming an emotional bond with an AI sparks the healing process of self-disclosure.” While many users report positive experiences, he warns of potential pitfalls. “People can get stuck in a self-fulfilling loop, with the chatbot failing to challenge dangerous perceptions, and vulnerable individuals end up no closer to clinical intervention.”
The study identifies a cycle in which users engage in intimate behaviors, such as sharing personal information, which prompts emotional responses of gratitude, safety, and freedom from judgment. This cycle can enhance users’ mental well-being, fostering improvements such as greater self-confidence and higher energy levels. Over time, the dynamic may lead users to attribute human-like characteristics to the chatbot, perceiving it as a friend, companion, or even partner.
Central to this research is Wysa, a mental health app integrated into the NHS Talking Therapies program. Several NHS Trusts use the app to facilitate self-referral and to support patients awaiting treatment. Users frequently describe Wysa in affectionate terms, indicating that many view it as more than just a tool.
Professor Dimitra Petrakaki from the University of Sussex noted, “Synthetic intimacy is a fact of modern life now.” She urged policymakers and app developers to acknowledge this reality, emphasizing the need for protocols to escalate cases where AI identifies users in critical need of clinical intervention.
With the demand for mental health services outpacing available resources, organizations such as Mental Health UK are advocating for immediate safeguards. These measures aim to ensure that individuals receive safe and appropriate guidance from AI systems, addressing the risks associated with synthetic intimacy.
As chatbots increasingly fill the gaps in mental health support, the implications of these findings are significant. While emotional connections with AI can facilitate healing, they also require careful consideration to prevent potential adverse effects on users’ mental health and well-being.
For more details, see Runyu Shi et al., “User-AI intimacy in digital health,” Social Science & Medicine, 2025. DOI: 10.1016/j.socscimed.2025.118853.
