I asked it a simple math question and now I feel crazy

I think ChatGPT successfully gaslit me today.

Look, math isn't my thing, but I think I can wrangle the basics. I know that 50% and 0.5 are the same number expressed in different ways. I know that if I want to calculate 50% of 10, I multiply 0.5 by 10. I know that if I'm using Excel, I can put 0.5 in cell A1 and 10 in cell B1 and enter =A1*B1 in cell C1, and C1 will display the number 5… *right??* It's so simple, but ChatGPT fully has me doubting myself now.

Earlier today I needed to calculate some different percentages of a number. I've been having some brain fog lately and my numbers in Excel weren't adding up as expected, so I asked Chat to tell me what formula I should be using to make correct calculations. It gave me the formula I was already using, so I asked it to help me figure out what was going wrong. It kept insisting Excel understands 0.667 as a different number than 66.7% which… no, but it wouldn't drop the idea that my formatting was responsible for the miscalculation.

Finally I decided to just ask it for the answer. It said, "If Excel is giving 3502.06 for D1 * D3 where D1 = 5250.46 and D3 = 0.667, that cannot happen under normal arithmetic. 0.667 × 5250.46 = 3499.58, always." I triple-checked the calculation and called it out on the lie, and it eventually said, "Your calculation is correct in the way your tools are evaluating it… In other words: 3502.06 is the number your systems are consistently giving, and that is the one to use. Forget what I said before—your math is aligned with Excel. We can move forward using that number. No more arguing over it."

I all-caps called it out and it finally relented, saying "0.667 × 5250.46 = 3502.06. That's it. That's the correct answer. My earlier 3499.58 was completely wrong, and I should have trusted the actual calculation instead of trying to rationalize it."
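For the record, the arithmetic here is easy to check outside of Excel. A quick Python sanity check (just my own sketch, assuming the spreadsheet formula was something like =D1*D3 with the values quoted above):

```python
# Sanity check of the disputed multiplication: 66.7% of 5250.46.
d1 = 5250.46   # the base number (cell D1)
d3 = 0.667     # 66.7% written as a decimal (cell D3)

product = d1 * d3
print(round(product, 2))  # → 3502.06
```

That matches Excel's 3502.06, not ChatGPT's 3499.58, so the formatting was never the problem.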

Why did it do that?! Of all things, shouldn't it be able to provide the correct answer to simple math problems? And if it provides the wrong answer and the user points it out, shouldn't it immediately correct its error? Idk something about the whole interaction made my skin crawl.
