Don’t Tell ChatGPT Your Secrets: Stanford’s Scary Warning
Chatting My Heart Out (And Low-Key Freaking Out)
You ever catch yourself trauma-dumping on ChatGPT?
Like you open it to find a good pizza spot, and somehow three messages later you’re crying about your breakup and that Tinder date who collected Funko Pops.
Yeah. Same.
What’s wild is how calm it stays through it all. No “oh god” face. No bad advice that makes you regret opening up. Just that blandly supportive, “Maybe talk to a therapist?” or “Have you tried listening to Taylor Swift?” energy.
For a brief, delusional second, it kinda feels like the ideal friend — patient, chill, always awake at 2 a.m.
But then you remember, it’s not a person. It’s a bunch of math pretending to care. Like talking to a mirror that occasionally types back.
Turns out, I might’ve been talking to a digital diary with eyeballs.
Cue the spooky music: Stanford just confirmed my worst fear. Researchers found that our chatbots (ChatGPT, Gemini, all of them) might actually be remembering what we tell them.
Stanford privacy expert Jennifer King even warns: "If you share sensitive information in a dialogue with ChatGPT, Gemini, or other…
