
For Some Autistic People, ChatGPT Is a Lifeline


Fordham GSS Associate Professor Lauri Goldkind, Ph.D., was quoted in a recent WIRED article exploring ChatGPT’s usefulness for therapeutic services.

Using a chatbot to help with communication may seem unconventional, but it’s in line with some established ideas used in social work to help people become more independent. “We talk about empowering people and helping people to be fully autonomous and experience success on their own terms,” says Lauri Goldkind, a professor in Fordham University’s Graduate School of Social Service who focuses on the marriage of social work and technology. An accessible tool like a generative AI bot can often help bridge the gap left by intermittent access to mental health services like therapy, Goldkind says.

But the true impact of ChatGPT for therapeutic purposes is largely unknown; it's simply too new. WIRED reached out to four clinical therapists and counselors for input, and each declined to comment, saying they have yet to explore the use of ChatGPT as a therapeutic tool or encounter it in their sessions.

The chatbot’s flexibility also comes with some unaddressed problems. It can produce biased, unpredictable, and often fabricated answers, and is built in part on personal information scraped without permission, raising privacy concerns.

Goldkind advises that people turning to ChatGPT should be familiar with its terms of service, understand the basics of how it works (and how information shared in a chat may not stay private), and bear in mind its limitations, such as its tendency to fabricate information. Young, one of the autistic people interviewed in the article, said they have thought about turning on data privacy protections for ChatGPT, but also believe their perspective as an autistic, trans, single parent could be beneficial data for the chatbot at large.

Read the full article here.

