ChatGPT just totally making things up

Today I asked ChatGPT to summarize a file I uploaded. It gave me a long, weird summary about a woman in a kitchen inspecting a stain on the stove.

The thing is, the file had nothing to do with any of that. It was about student loan regulations. When I told ChatGPT the summary made no sense, it admitted it couldn't properly read the file and had instead told me what it assumed was in it based on what kind of file it appeared to be. That seems like such a goofy response instead of just saying the file couldn't be opened.
