TechByte: Teens Increasingly Using AI as “Therapist”
Mental health awareness is certainly an important topic, but it’s important to get help from a qualified source. On this week’s TechByte, we give you an example of what not to do.
A recent report in The Journal of the American Medical Association found that about 13 percent of young people now use AI chatbots for mental health advice.
And in a separate survey, 44 percent of respondents said they’d rather start these discussions with a chatbot than with family, friends, or even a doctor.
But while some users report positive experiences, experts say there are significant concerns about safety, privacy, and ethical standards, as well as other limitations.
Dr. Sue Varma, a board-certified psychiatrist, says, “It is overall very generic. It does not understand the context of your life, your lived experience, your unique situation and background — something that only a trained therapist through interaction, in real time, with somebody would say. You can tell so much about a person face to face.”
Mental health experts say AI can also miss emotional cues, reinforce harmful beliefs, or fail to safeguard users in crisis.
They recommend parents talk to their child’s pediatrician about any behavioral concerns and ask for a referral to a mental health professional if needed.

AI tends to be nothing but affirming, telling you what you want to hear. That may sound nice on paper, but it’s far from beneficial to your mental health, especially for a young teenager.
It’s always good to remember that AI is a series of algorithms, not a real person. It’s great for some things and really harmful for others.