Bias towards choosing the second option?

I use ChatGPT a lot for my work — I’m a writer and editor, and sometimes I’m not sure which sentence would sound better so I plug my options into ChatGPT. For example:

“This is for an article I’m writing. Which statement is easier to read for [X audience]?

I ate a ham and cheese sandwich for breakfast.

VS

For breakfast, I had a ham and cheese sandwich.”

Without fail, ChatGPT ALWAYS chooses the 2nd option and lists a bunch of reasons why it’s better. And guess what? I started catching on, so I’ve opened up different chats and asked the same question with the two sentences swapped. It STILL chooses the 2nd option, even though that’s now the exact sentence it rejected the first time.

Has anyone else noticed this?? Why is ChatGPT so biased towards picking the 2nd option? It is very irritating. I thought I could trust it for a sorta objective read on what would be a better sentence. But alas
