‘We haven’t figured that out yet’: Sam Altman explains why using ChatGPT as your therapist is still a privacy nightmare


  • OpenAI’s CEO says using ChatGPT for therapy has serious privacy risks
  • Your private chats might be exposed if OpenAI were to face a lawsuit
  • Feeding your private thoughts into an opaque AI is also a risky move

One consequence of having an artificial intelligence (AI) assistant like ChatGPT everywhere you go is that people start leaning on it for things it was never meant for. According to OpenAI CEO Sam Altman, that includes therapy and personal life advice – but it could lead to all manner of privacy problems in the future.

On a recent episode of the This Past Weekend w/ Theo Von podcast, Altman explained one major difference between speaking to a human therapist and using an AI for mental health support: “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”
