From TikTok to ChatGPT: The Hidden Risks of Self-Diagnosis and AI in Youth Mental Health


Recognition can be comforting, but also misleading.

From skincare tips to self-diagnosing ADHD in under a minute: social media has a video for almost everything. But when recognition turns into a label you give yourself, the consequences can be far from harmless. The same goes for turning to AI with your mental health problems.

Influencer (Mis)information

Social media is not just for fun pictures and juicy stories.
We also use it to gain knowledge, from skincare tips to self-diagnosing ADHD (Attention Deficit Hyperactivity Disorder).

And when social media isn’t enough, young people turn to AI chatbots like ChatGPT for a more private conversation.

Self-diagnosis trend

While influencers usually promote things like beauty products, another concerning trend has emerged lately: the self-diagnosis of mental health conditions.

ADHD is a prime example.

The story is always the same:
An “experience expert” looks into the camera and says with conviction:
“If you recognize these five signs, from impulsive behavior to getting easily distracted, then you might also have… ADHD.”

Mind you, I think it is very good when people share from their own experience what they struggle with, and when their diagnosis story helps others recognize themselves in it.

It’s also positive to see more people with lived experience getting involved in healthcare.

But it becomes problematic when young people who are already struggling mentally diagnose themselves based on what influencers say on social media.

This can turn into a form of self-stigmatization.
Once you have given yourself such a label, you start seeing everything you feel and do as a symptom of it.
You constantly view yourself through that ADHD lens.

And all the while, you do not even know for sure if you have ADHD!
Many of the symptoms you think you recognize could also be caused by stress or fatigue.

A vulnerable age group

Research suggests that the strongest associations between social media use and mental health issues are found among young adults aged 18–24, although effects are also visible in the broader 18–34 age group.

Young people do say they find support on social media.
But I think that is more about feeling less alone in their experiences than about actually finding reliable information on mental health issues.

For this age group, distinguishing between what is truly helpful and what is not is often difficult.
That is why I hope all those influencers think carefully about what they are doing, and about the consequences it can have, such as viewers not seeking professional help even though they need it.

ChatGPT

But discussing your mental struggles often calls for a more private setting. That is why more and more young people take the step from social media to AI chatbots like ChatGPT.

But chatbots, too, can send you down the wrong path.

You might end up with a mental disorder planted in your mind, and lose even more self-confidence because you think something is seriously wrong with you.

And no matter how convincing ChatGPT sounds, only a professional can tell you what is really going on.

When professional help is out of reach

Although I strongly advise young people with mental health issues to seek help from a professional rather than from TikTok or ChatGPT, I also have to admit that this is easier said than done.
In many countries — including the Netherlands, the U.S., Taiwan, and China — access to professional help is limited, often very expensive, or comes with long waiting lists.
As a result, many young people turn not only to influencers but also to ChatGPT.

Young people themselves regularly say they benefit from this: they feel heard and seen, ChatGPT is always available, and it never judges.
This sounds almost too good to be true — and it is.

Some psychiatrists report concerning cases…
Psychiatrist Dr. Keith Sakata (UCSF) told Business Insider that he has already treated 12 young patients this year with symptoms resembling psychosis after long periods of isolation and intense interaction with chatbots like ChatGPT.
Their symptoms included delusions, disorganized thinking, and even hallucinations. While chatbots may not directly trigger psychosis, they may contribute to it.

Another problem is that ChatGPT can also give harmful advice.
That is why it remains important to stress that turning to AI chatbots for help on your own can be dangerous, especially if you are in a vulnerable phase of life.

Not all AI is the same

ChatGPT is a general-purpose chatbot, but there are also apps built specifically for mental health. One example is Wysa, which, unlike ChatGPT, was created with mental health support in mind.

In 2022, Wysa received a Breakthrough Device designation from the U.S. Food and Drug Administration (FDA) after independent trials showed it could help people with chronic pain and the anxiety or depression often associated with it. It has been tested in research, used in academic studies, and receives strong user ratings, particularly for its privacy measures.

But Wysa is not perfect. Researchers note that more independent trials are needed, and, of course, no chatbot can replace a real therapist. Compared with the many unregulated apps on the market, Wysa appears to be a safer, more evidence-based tool, as long as it is used as a supplement to, not a substitute for, professional care.

Back in March, psychiatrist Meadi told the Dutch newspaper Trouw that he doesn’t believe AI chatbots are ready yet to be taken seriously as medical tools. (The EU doesn’t even allow their use for this purpose yet.) Sure, digital tools can offer some temporary support and make young people feel heard, but they’re no match for trained professionals.

If you’re a young person dealing with mental health struggles, getting professional help is still your best bet. AI might have a place in your support network, but don’t make it your only lifeline for guidance or treatment.

Thanks for reading.
