“I started wondering if we could use the ChatGPT API to build an AI therapist and fine-tune it to meet a therapist’s specifications,” she said. “It could increase access to treatment by providing free and confidential therapy, an AI rather than a human, and removing the stigma of getting help for people who don’t want to speak with a human.”
In theory, AI could help address the growing need for mental health options and the shortage of mental health professionals to meet that need. “Accessibility is simply a matter of a mismatch between supply and demand,” Iyer told BuzzFeed News. “Technically, the supply of AI could be infinite.”
A 2021 study published in the journal SSM Population Health that included 50,103 adults found that 95.6% reported at least one barrier to health care, such as the inability to pay for it. People with mental health issues appeared to be especially affected by barriers to care, including cost, a shortage of experts, and stigma.
A 2017 study found that people of color were particularly susceptible to gaps in health care as a result of racial and ethnic disparities, including elevated levels of mental health stigma, language barriers, discrimination, and a lack of health insurance coverage.
One benefit of AI is that a program can be translated into 95 languages in a matter of seconds.
“Em’s users are from all over the world, and because ChatGPT translates into multiple languages, I’ve noticed people using their native language to communicate with Em, which is very useful,” Brendle said.
Another advantage, says Brendle, is that while AI can’t provide true emotional empathy, it also can’t judge people.
“In my experience, the AI tends to be nonjudgmental, and that opens a philosophical door to the complexity of human nature,” Brendle said. “Though a therapist may present as nonjudgmental, as humans we tend to be anyway.”
When AI shouldn’t be an option
But mental health experts warn that AI may do more harm than good for people looking for deeper information, who need medication options, or who are in crisis.
“Predictably controlling these AI models is something that is still being worked on, so we don’t know in what unintended ways AI systems could make catastrophic mistakes,” Iyer said. “Since these systems don’t judge what is true or what is good or bad, they simply report what they have previously read, so it’s entirely possible that an AI system will have read inappropriate and harmful content and repeat that harmful content to people seeking help. It’s far too early to fully understand the risks here.”
People on TikTok also say that adjustments need to be made to the online tool; the AI chat, for example, could provide more helpful feedback in its responses, they say.
“ChatGPT is often reluctant to give the kind of definitive answer or judgment about a situation that a human therapist could provide,” Lum said. “ChatGPT also somewhat lacks the ability to offer a new perspective on a situation that a user may have overlooked but that a human therapist might be able to see.”
Some psychiatrists think ChatGPT can be a useful way to learn more about medications, but it shouldn’t be the only step in treatment.
“It might be best to think of asking ChatGPT about medications the way you would look up information on Wikipedia,” Torous said. “Finding the right medication is about matching it to your needs and your body, and neither Wikipedia nor ChatGPT can do that right now. But you can learn more about common medications so you can make a more informed decision later.”
There are other options, including calling 988, a free crisis hotline. Crisis hotlines have calling and messaging options available for people who can’t find mental health resources in their area or who don’t have the financial means to meet someone in person. There are also the Trevor Project hotline, SAMHSA’s National Helpline, and others.
“We have very good and accessible resources, like dialing 988 for help, that are good options in a crisis,” Torous said. “Using these chatbots during a crisis is not recommended, because you don’t want to rely on something untested and not even designed to help when help is needed the most.”
The mental health experts we spoke with said AI therapy could be a useful tool for venting emotions, but until further improvements are made, it can’t outperform human experts.
“Right now, programs like ChatGPT are not a viable option for those looking for free therapy. They can offer some basic support, which is great, but not clinical support,” Torous said. “Even the makers of ChatGPT and related programs are very clear about not using them for therapy right now.”
In the US, you can dial 988 to reach the National Suicide Prevention Lifeline. The Trevor Project, which provides support and suicide prevention resources for LGBTQ youth, can be reached at 1-866-488-7386. You can also find resources through Befrienders Worldwide (befrienders.org).