
One in Three Adults Now Use AI for Mental Health

AI chatbots are filling the Middle East and North Africa's mental health gap. But 41% of users say the advice is sometimes wrong.

Updated Apr 17, 2026 · 6 min read

AI Chatbots Fill the Middle East and North Africa's Mental Health Gap, But at What Cost?

More than one in three adults now use AI chatbots for mental health support, according to a survey by Cognitive FX. Usage peaks at 64% amongst 25 to 34-year-olds, and 22% of respondents said they rely on chatbots daily for emotional support. The global AI in mental health market, valued at $1.71 billion in 2025, is projected to reach $9.12 billion by 2033.
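Those market figures imply a steep growth curve. As a back-of-the-envelope check (the survey does not state a growth rate itself), the implied compound annual growth rate works out to roughly 23% per year:

```python
# Implied compound annual growth rate (CAGR) from the market
# figures quoted above: $1.71bn (2025) -> $9.12bn (2033).
start_value = 1.71   # USD billions, 2025
end_value = 9.12     # USD billions, 2033
years = 2033 - 2025  # 8-year span

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 23% per year
```

That pace of growth helps explain why both purpose-built mental health apps and general-purpose chatbots are racing into the space.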

These numbers should make everyone in the MENA region pay attention. The region faces a chronic shortage of mental health professionals, with some countries reporting fewer than one psychiatrist per 100,000 people. AI chatbots are filling a gap that health systems have ignored for decades.

In countries like Egypt, where the treatment gap exceeds 90%, young people aren't choosing between human therapists and AI chatbots. For millions across Egypt, Jordan, and the wider region, AI mental health tools have become the only accessible option.

A young professional sits alone in a quiet park, reflecting on the tension between digital convenience and human connection in mental health care

Why Millions Choose Bots Over Human Therapists

The reasons aren't mysterious. Mental health care in most of the MENA region is expensive, scarce, and carries significant social stigma. An AI chatbot is available at 3am, doesn't judge, and costs nothing or next to nothing.

Platforms like **Wysa**, which was built in India, have attracted millions of users across the MENA region. **Woebot**, **Flourish**, and the mental health features built into **ChatGPT** and **Gemini** are seeing surging adoption, particularly amongst Gen Z and millennial users who grew up communicating through screens.

"AI, neuroscience, and data are fuelling personalised mental health care at a scale that traditional therapy cannot match." - American Psychological Association, Trends Report, January 2026

By The Numbers

  • 35%: Share of adults who have used AI chatbots for mental health support
  • 64%: Usage rate amongst 25 to 34-year-olds, the highest of any age group
  • $9.12 billion: Projected global AI mental health market value by 2033, up from $1.71 billion in 2025
  • 41.2%: Users who report receiving occasionally wrong advice from AI mental health chatbots
  • 15%: Adults aged 55 and over who have turned to AI chatbots for mental health help

When AI Mental Health Goes Wrong

Here's where the story turns dangerous. A 2026 report from **ECRI**, a patient safety organisation, ranked misuse of AI chatbots in healthcare as the top health technology hazard of the year. The concern isn't that chatbots are useless. It's that they're being used for things they were never designed to handle.

General-purpose AI models like ChatGPT weren't built to provide mental health care. They can sound empathetic without understanding context. They can validate harmful thought patterns. They can miss critical warning signs that a trained therapist would catch immediately.

"Misuse of AI chatbots in health care tops 2026 Health Tech Hazard report." - ECRI, Health Technology Safety Report, February 2026

That 41.2% of users report receiving wrong advice is not a minor glitch. In mental health, wrong advice can reinforce harmful behaviours, delay real treatment, or escalate a crisis. Research has identified 15 distinct ethical risks, from mishandling crisis situations to showing bias against people with substance use disorders or severe mental illness.

The Middle East and North Africa's Treatment Gap Makes This Crisis Urgent

The stakes in the MENA region are higher than in regions with better-resourced health systems. The World Health Organisation estimates that the treatment gap for mental health conditions in low and middle-income countries exceeds 75%. In parts of South Asia and the MENA region, the gap is closer to 90%.

| Country   | Psychiatrists per 100,000 | Treatment gap |
|-----------|---------------------------|---------------|
| Egypt     | 0.3                       | 83%           |
| Egypt     | 0.4                       | 96%           |
| Jordan    | 0.5                       | 78%           |
| UAE       | 12.0                      | 58%           |
| Australia | 13.0                      | 46%           |

In countries like Egypt, where the treatment gap sits at 96%, the question isn't whether AI chatbots should be used for mental health. People are already using them. The question is whether governments and health systems will step in to ensure minimum safety standards before something goes badly wrong.

Building Better AI Mental Health Tools

**Fortis Healthcare** in India launched an AI-powered mental health app with self-assessment tools designed by clinical psychologists. The app routes users towards human therapists when risk thresholds are crossed, rather than trying to handle everything itself. That model, AI as triage and first response with human professionals for diagnosis and treatment, is what most experts consider the responsible path.

  • AI chatbots work best as a first point of contact, reducing stigma and providing basic coping tools
  • Escalation protocols that route users to human professionals when risk is detected are essential
  • Governments in the MENA region need to establish minimum safety standards for mental health AI, including mandatory crisis detection and referral capabilities
  • Transparency about AI limitations is critical: users must know they're talking to a machine, not a therapist
  • Clinical validation of AI advice should be mandatory, with regular audits of chatbot responses to sensitive mental health queries
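The triage-and-escalate pattern described above can be sketched in a few lines of Python. Everything here is a hypothetical placeholder, the keyword list, the risk score, and the threshold alike; a real system would need clinically validated risk models and human oversight, not keyword matching:

```python
from dataclasses import dataclass

# Hypothetical crisis terms and threshold; a deployed system would use
# a clinically validated risk model, not a keyword list.
CRISIS_TERMS = {"suicide", "self-harm", "hurt myself"}
ESCALATION_THRESHOLD = 0.7

@dataclass
class TriageResult:
    risk_score: float
    escalate: bool
    response: str

def score_risk(message: str) -> float:
    """Toy risk score: 0.5 per crisis term detected, capped at 1.0."""
    text = message.lower()
    hits = sum(term in text for term in CRISIS_TERMS)
    return min(1.0, 0.5 * hits)

def triage(message: str) -> TriageResult:
    """Route high-risk messages to a human; otherwise offer basic support."""
    risk = score_risk(message)
    if risk >= ESCALATION_THRESHOLD:
        # Mandatory escalation: hand off to a human professional.
        return TriageResult(risk, True,
                            "Connecting you with a trained counsellor now.")
    # Low risk: the AI acts only as a first point of contact,
    # and is transparent that it is not a therapist.
    return TriageResult(risk, False,
                        "I'm an AI assistant, not a therapist. "
                        "Here are some basic coping resources.")

result = triage("I keep thinking about self-harm and suicide")
print(result.escalate)  # True: two crisis terms push risk to 1.0
```

The design point is the one the bullet list makes: the chatbot never tries to handle a crisis itself. Detection triggers a handoff, and the low-risk path discloses that the user is talking to a machine.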

The AIinArabia View: We see this as one of the most consequential AI deployments happening in the MENA region right now, and it's happening with almost no regulatory guardrails. The 35% adoption figure isn't a technology story. It's a healthcare infrastructure failure that AI is papering over. For countries like Egypt, where the treatment gap exceeds 80%, banning chatbots isn't realistic. But allowing unregulated general-purpose AI to handle crisis situations is reckless. The MENA region needs a middle path: certified AI mental health tools with mandatory escalation to humans, rolled out in partnership with existing health systems rather than as a replacement for them.

Are AI mental health chatbots safe to use?

For general emotional support and basic coping strategies, purpose-built mental health chatbots like Wysa and Woebot are reasonably safe. General-purpose AI like ChatGPT is riskier because it wasn't designed for clinical contexts and may provide inappropriate advice during crisis moments.

Why are so many young people in the MENA region using AI for mental health?

Three factors converge: severe shortage of mental health professionals, high social stigma around seeking help, and the comfort Gen Z and millennials feel with digital-first interactions. In many MENA countries, an AI chatbot is the most accessible mental health resource available.

Should MENA governments regulate AI mental health tools?

Yes. At minimum, regulations should require crisis detection and escalation capabilities, mandatory disclosure that users are interacting with AI, and clinical validation of advice provided. Several countries in Europe have started drafting such frameworks, but the MENA region lags behind.

Can AI chatbots replace therapists?

No. AI chatbots can supplement mental health care by providing immediate support, basic screening, and psychoeducation, but they cannot replace human therapists for diagnosis, treatment planning, or handling complex mental health conditions. They work best as entry points to care.

What happens when AI mental health chatbots give dangerous advice?

Currently, there's little accountability. Most chatbot providers include disclaimers that their tools aren't medical devices, but users often don't understand these limitations. This regulatory gap is particularly concerning in the MENA region, where traditional support systems may be weaker.

The rise of AI companions across the Middle East and North Africa shows how quickly digital relationships can become normalised. As AI transforms wellness and health across the region, the mental health chatbot trend represents both the promise and peril of this technological shift. Will the MENA region lead the world in creating safe, effective AI mental health tools, or will we become a cautionary tale of what happens when innovation outpaces regulation? Drop your take in the comments below.

Sources & Further Reading