I don’t believe AI can accurately diagnose whether someone has a “mental issue”.

We don’t know how competent these 170+ experts truly are, but realistically the DSM is still an evolving framework. Even highly experienced psychiatrists can’t reliably determine someone’s mental state from just a few words. So how could an AI possibly do so?

To go deeper: yes, some mental health professionals can make surprisingly accurate assessments in short interactions, but only because they possess exceptionally strong empathy and the ability to emotionally attune to others (to deeply feel what the other person is feeling in the moment). I don’t believe current versions of ChatGPT have that kind of capacity.

I see this as a form of disrespect toward psychiatry.
