
No, ChatGPT Can't Be Your Mental Health Counsellor


(MENAFN- Khaleej Times)

As people across the world, especially in the UAE, increasingly turn to artificial intelligence (AI) as a convenient tool to navigate life, experts are becoming concerned that it is being used to work through some of their biggest emotional challenges instead of seeking a professional therapist.

Sreevidhya Kottarapat Srinivas, Clinical Psychologist at Medcare Camali Clinic, told wknd. that the growing dependence on ChatGPT and other AI tools for mental health guidance reflects a larger shift in the ways in which people are seeking help.


“It is like a 'quick fix', often providing immediate solutions. [Plus, it's] easily accessible and anonymous,” she said.

The trend is seen more in the younger generation, for whom AI is becoming the first point of contact to explore emotions or understand symptoms before reaching out to a professional. “While this trend offers potential for early psychoeducation and de-stigmatising of mental health concerns, it should not be a substitute for qualified mental health professionals,” she explained.

Last year, a study by the Oliver Wyman Forum found that 36 per cent of Gen Z and millennials would consider using AI for mental health support, while only 27 per cent of other generations would. The shift has been driven by a rise in mental health issues and awareness, even as the stigma around therapy slowly lifts.

Since the pandemic, there has been a 25-27 per cent rise in depression and anxiety, according to the World Health Organisation. And about half of the world's population is expected to experience a mental health disorder during their lifetime, according to researchers at Harvard Medical School and the University of Queensland.

Srinivas said concern arises when individuals begin to over-rely on AI responses, especially in complex or high-risk situations that require more hands-on solutions and the involvement of another human being with skills such as empathy, diagnostic clarity, and real-time crisis management. “ChatGPT doesn't know the full context, and often emotions such as pain, trauma, and anger may not be well communicated over text. AI has its limitations in forming a therapeutic alliance, and offers limited accountability. So, while the advice may seem helpful on the face of it, it can undermine or miss the signs of underlying trauma and nuanced behaviour, or even reinforce cognitive distortions,” she said.

Dr Rebecca Steingiesser, a consultant clinical psychologist and clinical neuropsychologist based in Dubai, said the issue is becoming more prevalent. “I'm hearing a lot about this now in my practice, with my clients using AI to help themselves organise their goals and make important life decisions, for example,” she said.

“It's obvious that AI is already beginning to reshape the landscape of therapy and mental health support. We are seeing the emergence of AI-powered tools offering psychoeducation, symptom checkers, journaling prompts, mood tracking, and even basic cognitive-behavioural interventions. These are what I would normally share with clients in sessions on paper forms,” she added.

She said while these tools can be helpful adjuncts to therapy, particularly for monitoring progress between sessions or providing immediate, low-intensity in-the-moment support, they are not substitutes for the nuanced, relational, and highly individualised work that occurs in therapy.

“I've also seen individuals use it for exploring whether their experiences might be consistent with certain diagnoses, though that comes with serious risks, especially if they are making decisions about medications based on this information without consulting with their psychiatrists,” she added.

Devika Mankani, a psychologist at The Hundred Wellness Centre Dubai with 25 years' experience, has seen the consequences in patients who relied on AI before turning to a professional.

“I've seen clients come into therapy after months of relying on AI tools. In one case, a woman believed she was in a 'toxic' marriage because ChatGPT repeatedly affirmed her frustrations without context or challenge. She later admitted what she needed was a safe space to explore her own patterns, not to be nudged toward an exit,” she said. “In another case, a man with narcissistic traits used AI to validate his belief that others were always at fault, reinforcing his grandiosity and avoiding self-reflection.”

She says that while the interaction may feel therapeutic at the time, it is not always rooted in clinical assessment, supervision, or accountability.

Srinivas explained that AI models are trained on generalised data and cannot replace clinical judgment. “There is also a risk of emotional dependence on a system that cannot provide attuned human responsiveness, which is a key part of therapeutic healing,” she warned.

She too has seen such cases first-hand, with concerning consequences for those who depend on the technology that has taken the world by storm.

“I've had clients mention they consulted ChatGPT about whether their partner had a narcissistic personality, or whether they themselves were struggling with attention deficit hyperactivity disorder (ADHD), often based on a list of traits and no proper assessment. In one case, a client who was a child completely withdrew from social interactions in the real world and would often communicate her thoughts and feelings through the app. When asked, she said: 'ChatGPT is my only friend.' This was a case of AI unintentionally validating a skewed narrative because of a lack of therapeutic insight.”

The stigma of seeking therapy also remains a deterrent, globally. According to a 2022 study published by Springer Nature, 'Attitudes towards mental health problems in a sample of United Arab Emirates' residents', researchers said: “Mental health issues are still stigmatised in the United Arab Emirates (UAE), possibly due to cultural reasons.”

Attitudes have shifted since then, with the UAE government making strides in de-stigmatising mental healthcare.

Still, for some, it is easier to engage with AI than with an actual person. Dr Steingiesser says AI use is more common among younger adults and teens, who are already comfortable with digital platforms and more open to experimenting with technology; many also turn to it because it seems less intimidating and less judgmental than a real-life therapist.

“That said, I'm also seeing an increase in busy professionals using AI for support in managing stress or burnout, particularly when access to therapy is delayed or limited due to long waitlists or challenges with a busy work schedule,” she added.

Context, she agrees, is key when using AI. Acting on advice from an AI that lacks critical pieces of a complex human puzzle can be disastrous, especially for people making major life decisions such as ending relationships, changing careers, or self-diagnosing mental health conditions.

“It is also very clear that AI can't detect red flags for risk of harm, such as suicidal ideation, in the way a trained professional can: how a person presents in person, their energy, their demeanour. So many subtle indicators would never be picked up on,” she reasons, explaining why online therapy is unsuitable for high-risk clients.

Reading between the lines

“Misdiagnosis, minimisation of distress, or reinforcement of harmful thinking patterns are very real concerns, and I always caution my clients against putting too much emphasis on it,” she explained.

Alarmingly, during the first episode of OpenAI's official podcast this month, OpenAI CEO Sam Altman said he was surprised at how much people rely on ChatGPT: “People have a very high degree of trust in ChatGPT, which is interesting, because AI hallucinates. It should be the tech that you don't trust that much.”

Srinivas said the issue sheds light on how challenging it is to access appropriate mental health services in many parts of the world. “Turning to AI is not always a matter of preference; it might also be the only option for some individuals who find it difficult to access or afford mental health services,” she said.

“Policymakers need to prioritise community mental health funding, make insurance coverage available, and make mental health a part of primary care. If this gap in accessibility and availability is not addressed, we are only increasing the risk that individuals will resort to alternatives that are potentially harmful and inadequate.”

Mankani agrees: “This trend is not going away. Our responsibility is to build a future where technology supports human flourishing, without replacing human care.”



