If you follow Reddit threads on anything pertaining to electricians, HVAC, auto mechanics, and the like, you will find guys posting there who have no idea about the subject. They just put the question into an AI and post the results as if they know what they are talking about.

I have found ChatGPT to be excellent and accurate on some things, like medical questions or reading an owner's manual, but when discussing something that requires interpretation, like the National Electrical Code, it falls short. It seems to understand words, but not meaning. It cannot make decisions, such as which wiring method to use in a specific situation. It has similar problems with the other subjects I mentioned, but it never says it does not know. It apparently reads a lot of similar material on different forums and often comes up with something hilariously inaccurate.

Then there are guys who will pick up on what ChatGPT says, post it as if they are answering the question, and make fools of themselves. I cannot imagine why they do this, except that it might make them feel more masculine to answer questions on this type of subject. If you point out where they are wrong, they will often double down on what they said. So ChatGPT is, in a manner of speaking, spreading misinformation.

I am sure ChatGPT will catch up; I have corrected it in the past, and it has admitted it was wrong. Eventually it will be a good source for most anything. In the meantime, why would anyone use it to post answers to questions on a subject they have no knowledge of?
