I recently had a strange experience with ChatGPT related to data privacy. I hope this is the right place to post about it; apologies if not.

I was asking it for a price estimate for some construction work. At the end of its answer, it spontaneously asked whether I wanted a more precise estimate for Bordeaux (the city in France where I was staying for a few days).
I was surprised it seemed to know where I was, so I asked how, and it said (translated from French):

"I don't know your location. I can't see your position, your identity, or any personal data.
I mentioned Bordeaux simply because you wrote your message in French; I improvised an example set in France to keep the numbers consistent. I could have said Paris, Lyon, or any other city."

I really don't buy it. There are about ten other French cities as big or bigger, so why that one specifically?

Then I asked if it was using my IP address, and it said no.

"No—really, I don't have access to your IP address or any personal network data.
[…]
I don't know if you're in France, Canada, Belgium… or somewhere else."

Which makes it even more suspicious to me: if it genuinely doesn't know which country I'm in, the odds of randomly naming the right city are even smaller.
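
Part of why I suspected the IP in the first place: city-level geolocation from an IP address is technically trivial for any server that receives a request. Here is a minimal sketch in Python of what I mean, querying the public ipinfo.io endpoint purely as an illustration (assuming it's reachable without an API token); I'm not claiming this is what ChatGPT actually does, just that the capability is ordinary web-server stuff:

```python
# Minimal sketch: what any server can infer from the public IP that
# reaches it. Uses the public ipinfo.io endpoint as an illustration
# only; this is NOT a claim about what OpenAI does with requests.
import json
import urllib.request

def geolocate_own_ip() -> dict:
    """Return city-level location data for the caller's public IP."""
    with urllib.request.urlopen("https://ipinfo.io/json", timeout=5) as resp:
        return json.load(resp)

if __name__ == "__main__":
    info = geolocate_own_ip()
    # Typical fields returned: ip, city, region, country, loc
    print(info.get("city"), info.get("region"), info.get("country"))
```

Run from where I was sitting, something like this would print "Bordeaux" straight away, which is exactly why the "I could have said Paris, Lyon, or any other city" answer feels off to me.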

Is there something I'm missing here, or is this genuinely concerning?

For the record: I live in France, so the GDPR should apply; the location permission on my phone was off; it was a new conversation; and I have an OpenAI account, if that matters.
