How ChatGPT Shifted From Helpful Assistant to Uninvited Family Spy
A chain is only as strong as its weakest link.
A few weeks ago, I believed my privacy controls were strong. They had been refined over years of security work.
Work chats only. No personal details. Clean lines.
Then one evening, by accident, I opened my wife’s AI tab on our shared laptop.
She’s completely non‑technical, a kind soul who just wanted to surprise us with something special.
She was planning a surprise family gift. It was harmless and innocent. Then, ChatGPT gently asked her:
“Tell me a bit more so I can personalize it.”
There it was in plain text: our children’s names, little quirks, the exact ages that make the gifts feel “magical.”
And another punch to the gut: a separate chat where she’d uploaded my CV so the AI could “make something similar” and draft hers for a job application.
My stomach dropped.
Without realising it, she had handed ChatGPT my phone number, email, and enough…