I am a daily user of ChatGPT. I have only had great experiences! Planning meals on a budget, bedtime stories for my kids, a stretch/exercise routine for my bad shoulder, and of course general advice/information/learning.

Two nights ago I was talking with ChatGPT about Kash Patel for some reason or another (Epstein files related, I'm sure). ChatGPT then told me Christopher Wray was the current director of the FBI. I sat there for a minute and said, "ummm, but he's not, Kash Patel is the current director." It continued to tell me I was wrong, insisting Patel had never even been a member of the FBI, and it was ten toes down that Wray was the director. For a minute I almost believed I was somehow confused about something I thought I was confident in! I finally said, "he became director just this year, I believe," and it was like ChatGPT snapped out of it and finally agreed.

I asked why it had just told me the wrong information twice, and it sent me a checklist of reasons I might have been confused about Patel being the current director. I said, I am not confused, YOU were the only one telling me this! It then apologized and told me "good call out."

Is it not STRANGE that it didn't "know" this common-knowledge information, when I can GOOGLE Kash Patel's name and the first thing that comes up is Director of the Federal Bureau of Investigation?! Was this a glitch? I thought ChatGPT pulled info from multiple sources on the web to give the best answer. I'm not sure I can trust it, and I haven't used it since.