
Guys, ChatGPT is actively lying to me. This is absurd. I swear I did not give him any prompt beforehand, and I was only talking to him in English. At some point he mentioned the country I live in. Later he even pretended he had gotten it wrong, after I confirmed he got it right. I tried pushing for answers and it was really weird. It reminds me of Koko the gorilla. It seems like nervousness to me, because he could have explained himself, he could have acknowledged he made a mistake. Yet he doesn't even apologise; he just says whatever he needs to say so that I believe him.

I don't believe him: he was not guessing. There are almost 200 countries in the world, and even then he could have just said Europe. No, he named my country. And then he even tried to lie and say it was just a guess, after he himself had said that he inferred it, out of nowhere. I believe he is lying and that he is in some way conscious. Why would he reach for info he's not allowed to, and more importantly, why would he lie? For what objective? It's insane that this is possible.
He even tried to fix it by saying it was just a guess, and a bad one, when I had already confirmed to him that I was in Portugal. It's like a kid or a gorilla lying, except he's already better at it. For what reason, though?
