The Therapy Equivalent of Uber

Several serious ethical crises emerging at online therapy companies like Talkspace and BetterHelp are gaining mainstream attention. Consumer beware: there are credible allegations of fake reviews on a grand scale (NYT), a CEO who openly advocates data mining of “confidential” (and undeletable) therapy transcripts (NYT), blatant HIPAA violations such as openly revealing patient emails (Forbes), and much more.

At Talkspace, Start-Up Culture Collides With Mental Health Concerns (NYT): The therapy-by-text company made burner phones available for fake reviews and doesn’t adequately respect client privacy, former employees say.

“We need data. All of our data. Mine and yours.” (NYT): Talkspace CEO pens op-ed supporting mining data from patients’ undeletable therapy transcripts.

Talkspace Reveals Clients’ Email, Violating Clinical Confidentiality (Forbes): “Talkspace explicitly defines itself as a ‘Platform.’ It is a business with customers and not a healthcare provider with clients or patients. And it is as a business they promise both anonymity and confidentiality. They use the language of a clinical relationship. But that is not what Talkspace is. When Talkspace promises confidentiality it is done with all the limitations in trust inherent in any company’s marketplace promises. Talkspace defines itself and is only accountable as a business, not as a healthcare provider.”

BREAKDOWN: Inside the messy world of anonymous therapy app Talkspace (Verge): This lengthy exposé discusses how mental health counselors are exploited via low pay, unpredictable pay, unmanageable hours, and being forced to violate professional ethics in service of business concerns. It also details patient rights violations: everything from leaky privacy and professionally irresponsible anonymity to predatory or bogus charges, patient abandonment, and an overall failure to provide medical care while misleading patients.

YouTube’s BetterHelp mental health controversy, explained (Polygon): Not only do many of YouTube’s celebrity creators have lucrative sponsorship deals with BetterHelp, but YouTube itself has ties with BetterHelp and often displays ads for BetterHelp under the videos of any creator discussing mental health. Backing away from the sponsorship means a loss of impressive affiliate money for creators; that’s part of the issue. Most of the videos that have BetterHelp sponsorships revolve around a creator discussing their own issues. While these discussions are valid, viewers have complained that it feels like profiteering off mental illness at best, and causing serious harm to mentally ill people at worst. (BetterHelp’s terms of service state that the company can’t guarantee a qualified professional: “We do not control the quality of the Counselor Services and we do not determine whether any Counselor is qualified to provide any specific service as well as whether a Counselor is categorized correctly or matched correctly to you.”)

Therapy app Talkspace accused of ethically questionable practices (MobiHealthNews): “The notion that the company can read the chats – and isn’t entirely clear about the circumstances in which it does – raises concerns about privacy and confidentiality and may have spurred the August HIPAA complaint by a therapist on the platform. It also threatens to undermine the trust relationship between patients and therapists. So does the notion that a therapist might be required to insert a script into a therapy session, promoting or advertising additional Talkspace services – a practice The Verge says Talkspace has engaged in.”

posted by MiraK (4 comments total) 4 users marked this as a favorite

As a therapist this is terrifying. As a patient this is even more terrifying. In sum, terrifying.
posted by AlexiaSky at 9:51 AM on August 9 [2 favorites]
This, in particular, blows my mind: “Users can’t delete their [therapy] transcripts, for example, because they are considered medical records.” I remember when I was in therapy, the psychologist who treated me put an enormous effort into writing up a medical record for me that divulged the bare minimum, in the vaguest terms possible, in order to protect my privacy, precisely because he was required by law to keep it on file for many years and possibly reveal it to courts if required to by a judge. So my record was full of session notes that read like: “Patient reports anxiety over a recent concerning event. We discussed relational and personal aspects of the event and connected it to childhood events. Psychodynamic and cognitive interventions were used.” And that’s it.

The idea that Talkspace stores THE ENTIRE TRANSCRIPT of every therapy session AS THE MEDICAL RECORD is … yeah, terrifying.
posted by MiraK at 9:55 AM on August 9 [4 favorites]

Sweet Jesus Fuck.
posted by seanmpuckett at 10:19 AM on August 9
The best way to protect data is to not collect it at all, and every tech system in wide use is designed to store every piece of data that passes into the system.

The prospect of texting therapy gave me the willies the first time I saw ads for it, and for similar non-therapy chatbots: who knows where that data is going?

I shouldn’t feel smug for being vindicated in every way, but I absolutely do. This sort of data collection and mining is what these sorts of chat systems were designed to do; using them for therapy is very nearly as bad an idea as electronic voting.

This isn’t to excuse anything that Talkspace has done. There are ways to keep this sort of thing secure, but you have to actually want to. The default is to just collect everything.
posted by BungaDunga at 10:23 AM on August 9

