
justaprogressive

(3,213 posts)
Tue Apr 1, 2025, 10:30 AM

Anyone who trusts an AI therapist needs their head examined - Cory Doctorow



There's a debate to be had about whether AI chatbots make good psychotherapists. This is not an area of my expertise, so I'm not going to weigh in on that debate. But nevertheless, I think that if you use an AI therapist, you need your head examined:

https://www.salon.com/2025/03/30/some-argue-ai-therapy-can-break-down-mental-health-stigma-others-warn-it-could-make-it-worse/

I'm not an expert on psychotherapy, but I am an expert on privacy and corporate misconduct, and holy shit is the idea of a chatbot psychotherapist running on some Big Tech cloud a terrible idea. Because while I'm no expert on therapy, I have benefited from therapy, and I know this for certain: therapy requires confidentiality.

Shrinks are incredibly careful about privacy. For example: when my brother was getting married, my therapist was invited to the wedding. His daughter and my brother's fiancee were close friends, and my brother's fiancee had grown up staying over at their house and wanted her friend and her friend's parents at the wedding. My therapist sat me down and said, "Now listen, I take confidentiality very seriously. If you want me to, I will pretend not to know you at the wedding. No one needs to know that you're seeing me – or any therapist."

I told him I didn't mind people knowing I'd seen him, but just that little fastidious gesture confirmed the trust I'd put in Alan. It meant that I could openly and freely discuss things I'd never told anyone before, and that I never told anyone ever again. Having those genuinely open conversations transformed my life, for the better.

Now consider the chatbot therapist: what are its privacy safeguards? Well, the companies may make some promises about what they will and won't do with the transcripts of your AI sessions, but they are lying. Of course they're lying! AI companies lie about what their technology can do (of course). They lie about what their technologies will do. They lie about money. But most of all, they lie about data.

There is no subject on which AI companies have been more consistently, flagrantly, grotesquely dishonest than training data. When it comes to getting more data, AI companies will lie, cheat and steal in ways that would seem hacky if you wrote them into fiction, like they were pulp-novel dope fiends:

https://arstechnica.com/ai/2025/03/devs-say-ai-crawlers-dominate-traffic-forcing-blocks-on-entire-countries/

When an AI company tells you it won't use your intimate secrets as training data, they are lying. Of course they're lying! This isn't just any data, it's data that isn't replicated elsewhere on the internet. It's rare – it's unique. It's a competitive advantage. AI companies will 100%, without exception, totally use your private therapy data as training data.

What's more: they will leak your therapy sessions. They will leak them because they can't figure out how to prevent models from vomiting up their training data verbatim:

https://www.theatlantic.com/technology/archive/2024/01/chatgpt-memorization-lawsuit/677099/

But they'll also leak because tech companies leak like hell. They are crawling with insider threats. If the AI company sticks around long enough, it'll leak your secrets. And if it goes bankrupt? That's even worse! When tech companies go bust, the first thing their creditors do is sell off their warehouses full of private data. The more private and compromising that data is, the harder they'll try to sell it:

https://www.eff.org/deeplinks/2025/03/how-delete-your-23andme-data

Now, maybe you're thinking, "OK, but that's a small price to pay if we can finally get therapy for everyone." After all, the country – the world – is in the midst of a terrible mental health crisis and there's a dire shortage of therapists.

Now, let's stipulate for the moment to the idea that chatbots are substitutes for human therapists – that, at the very least, they're better than nothing. I don't think that's true, but let's say it is. Even so, this is a bad tradeoff.

Here, try this thought-experiment: someone figures out a great business model to pay for therapy for poor people. "We turned therapy into a livestreamed reality TV show. If you're too poor to afford a therapist, you can go to one of our partially trained livestreamer therapists, who will broadcast all of your secrets to anyone who watches. There's a permanent archive of these sessions, and the worst people in the world comb through it 24/7 looking for embarrassing stuff to repost and go viral with. What, you don't like that? Oh, I see: you just don't think poor people deserve mental health. I guess the perfect really is the enemy of the good."


https://pluralistic.net/2025/04/01/doctor-robo-blabbermouth/#fool-me-once-etc-etc

Jirel

(2,263 posts)
1. No joke!
Tue Apr 1, 2025, 10:35 AM

Using AI for this is obscene. There is no connection, no concern, no way to guard against bizarre interactions that may lead to harm, no privacy. “Here, why don’t you train our AI with your most heartbreaking secrets? It’ll be fun!”

get the red out

(13,705 posts)
2. Easy to just hijack a person's brain
Tue Apr 1, 2025, 10:41 AM

Form them into the personalities the AI programmers have been paid to create.

JCMach1

(28,545 posts)
3. If privacy is a concern, host your own LLM Model...
Tue Apr 1, 2025, 10:50 AM

On your own system with Ollama.
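For anyone curious what "hosting your own" actually looks like: Ollama runs models entirely on local hardware and exposes a REST API on localhost, so no transcript ever leaves your machine. A minimal sketch, assuming Ollama is installed and serving on its default port 11434, with `llama3` standing in for whatever model you've pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt, model="llama3"):
    """Build a request to the local Ollama server. Nothing is sent to any cloud."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("What are some common grounding exercises for stress?")

# With Ollama running locally (`ollama serve` + `ollama pull llama3`),
# uncomment to get a completion from your own hardware:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The point of the exercise is the URL: everything stays on `localhost`, so the privacy objections about cloud-hosted chatbots simply don't apply.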

I get more than a little annoyed with broad proclamations by reporters who have little understanding of what they're writing about.

Note, I am not saying I would ever use, or ever recommend, LLMs as therapy. Personally, I wouldn't for myself.

That said, could something as simple as an LLM and some simple exercises have helped me get over grief? Probably. I just kept everything internal.

Is simple talk therapy effective? It can be helpful.
Have I seen LLM therapy work for people? Yes.

Conflicted... yep.

LLMs can do a lot of stuff.

nolabear

(43,649 posts)
4. As a retired therapist, I'll say an important piece is missing.
Tue Apr 1, 2025, 11:10 AM

Therapy isn’t just reciting your problems and being given advice. It’s forming a trusting relationship with another human being, dealing with all those issues of fear and trust and anger and shame and a host of other building blocks that make you who you are, deeply exploring them with another human being and not dying, or killing them, or being encouraged to avoid pain with things that cause more problems than they solve. It’s saying whatever you need to say without repercussion, or having it do harm, or having it spread around. When done well it’s deeply moving to both parties, which in itself restores hope and lends strength. Fuck AI therapy. Next somebody will claim it can parent as well as people can—already a creepy sci-fi concept. We don’t need more separation from one another.

patphil

(7,602 posts)
5. I seem to remember the use of AI for therapy was in Neuromancer, a Sci Fi book by William Gibson.
Tue Apr 1, 2025, 11:13 AM

The book came out in 1984, and won several awards.
Can you believe it...1984!

https://en.wikipedia.org/wiki/Neuromancer
