General Discussion
Anyone who trusts an AI therapist needs their head examined - Cory Doctorow

https://www.salon.com/2025/03/30/some-argue-ai-therapy-can-break-down-mental-health-stigmaothers-warn-it-could-make-it-worse/
I'm not an expert on psychotherapy, but I am an expert on privacy and corporate misconduct, and holy shit is the idea of a chatbot psychotherapist running on some Big Tech cloud a terrible idea. Because while I'm no expert on therapy, I have benefited from therapy, and I know this for certain: therapy requires confidentiality.
Shrinks are incredibly careful about privacy. For example: when my brother was getting married, my therapist was invited to the wedding. His daughter and my brother's fiancee were close friends, and my brother's fiancee had grown up staying over at their house and wanted her friend and her friend's parents at the wedding. My therapist sat me down and said, "Now listen, I take confidentiality very seriously. If you want me to, I will pretend not to know you at the wedding. No one needs to know that you're seeing me or any therapist."
I told him I didn't mind people knowing I'd seen him, but just that little fastidious gesture confirmed the trust I'd put in Alan. It meant that I could openly and freely discuss things I'd never told anyone before, and that I never told anyone ever again. Having those genuinely open conversations transformed my life, for the better.
Now consider the chatbot therapist: what are its privacy safeguards? Well, the companies may make some promises about what they will and won't do with the transcripts of your AI sessions, but they are lying. Of course they're lying! AI companies lie about what their technology can do (of course). They lie about what their technologies will do. They lie about money. But most of all, they lie about data.
There is no subject on which AI companies have been more consistently, flagrantly, grotesquely dishonest than training data. When it comes to getting more data, AI companies will lie, cheat and steal in ways that would seem hacky if you wrote them into fiction, like they were pulp-novel dope fiends:
https://arstechnica.com/ai/2025/03/devs-say-ai-crawlers-dominate-traffic-forcing-blocks-on-entire-countries/
When an AI company tells you it won't use your intimate secrets as training data, they are lying. Of course they're lying! This isn't just any data, it's data that isn't replicated elsewhere on the internet. It's rare. It's unique. It's a competitive advantage. AI companies will 100%, without exception, totally use your private therapy data as training data.
What's more: they will leak your therapy sessions. They will leak them because they can't figure out how to prevent models from vomiting up their training data verbatim:
https://www.theatlantic.com/technology/archive/2024/01/chatgpt-memorization-lawsuit/677099/
But they'll also leak because tech companies leak like hell. They are crawling with insider threats. If the AI company sticks around long enough, it'll leak your secrets. And if it goes bankrupt? That's even worse! When tech companies go bust, the first thing their creditors do is sell off their warehouses full of private data. The more private and compromising that data is, the harder they'll try to sell it:
https://www.eff.org/deeplinks/2025/03/how-delete-your-23andme-data
Now, maybe you're thinking, "OK, but that's a small price to pay if we can finally get therapy for everyone." After all, the world is in the midst of a terrible mental health crisis and there's a dire shortage of therapists.
Now, let's stipulate for the moment that chatbots are substitutes for human therapists; that, at the very least, they're better than nothing. I don't think that's true, but let's say it is. Even so, this is a bad tradeoff.
Here, try this thought-experiment: someone figures out a great business model to pay for therapy for poor people. "We turned therapy into a livestreamed reality TV show. If you're too poor to afford a therapist, you can go to one of our partially trained livestreamer therapists, who will broadcast all of your secrets to anyone who watches. There's a permanent archive of these sessions, and the worst people in the world comb through it 24/7 looking for embarrassing stuff to repost and go viral with. What, you don't like that? Oh, I see: you just don't think poor people deserve mental health. I guess the perfect really is the enemy of the good."
https://pluralistic.net/2025/04/01/doctor-robo-blabbermouth/#fool-me-once-etc-etc

Jirel
(2,263 posts)Using AI for this is obscene. There is no connection, no concern, no way to guard against bizarre interactions that may lead to harm, no privacy. Here, why don't you train our AI with your most heartbreaking secrets? It'll be fun!
get the red out
(13,705 posts)Form them into the personalities the AI programmers have been paid to create.
JCMach1
(28,545 posts)On your own system with Ollama.
I get more than a little annoyed with broad proclamations by reporters with little understanding of what they are writing about.
Note, I am not saying I would ever use, or ever recommend, LLMs as therapy. Personally, I wouldn't for myself.
Having just said that, though: could something as simple as an LLM and some simple exercises have helped me get over grief? Probably. I just kept everything internal.
Having said that as well, is simple talk therapy effective?... Can be helpful.
Have I seen where LLM therapy has worked for people? Yes.
Conflicted...yep.
LLMs can do a lot of stuff.
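To make the "on your own system with Ollama" point concrete: a sketch of what a fully local setup looks like, assuming Ollama is installed and serving its HTTP API on the default port 11434 (the model name "llama3" is just an illustrative placeholder). Nothing here leaves your machine; the prompt goes to localhost and nowhere else.

```python
# Hypothetical sketch: talking to a locally hosted model through Ollama's
# HTTP API, so transcripts never touch a Big Tech cloud.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def ask_local_model(model: str, prompt: str) -> str:
    """Send the prompt to the local model and return its reply text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Whether a local model is any good as a listener is a separate question, but this is the privacy-relevant difference: the transcript exists only on hardware you control.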
nolabear
(43,649 posts)Therapy isn't just reciting your problems and being given advice. It's forming a trusting relationship with another human being, dealing with all those issues of fear and trust and anger and shame and a host of other building blocks that make you who you are, deeply exploring them with another human being and not dying, or killing them, or being encouraged to avoid pain with things that cause more problems than they solve. It's saying whatever you need to say without repercussion, or having it do harm, or having it spread around. When done well it's deeply moving to both parties, which in itself restores hope and lends strength. Fuck AI therapy. Next somebody will claim it can parent as well as people can (already a creepy sci-fi concept). We don't need more separation from one another.
patphil
(7,602 posts)nolabear
(43,649 posts)patphil
(7,602 posts)The book came out in 1984, and won several awards.
Can you believe it...1984!
https://en.wikipedia.org/wiki/Neuromancer