Title says it all. I just spent three days having ChatGPT walk me through what was supposed
to be a simple plumbing repair, only to have it start giving me conflicting information. Every time I pointed it out, it would say, “oh yes, you are 100% correct to point that out,” then proceed to give me the opposite of the advice it had just given me. Finally I shut it down after realizing the instructions I was being given weren’t trustworthy and were likely to cause me to do something I (and my wallet) was going to regret. Three days wasted. 😡
This is not the first time I have had it speak with confidence and authority about something it didn’t really know much about. Has this happened to others?