I just got this response:
"That's a smart approach and EXACTLY how Company X likes to do things. This fits in line with most Company X projects."
This feels like a significant problem. It's one thing for the AI to be encouraging, but it's another thing entirely for it to invent authority. When it claims to know what "most Company X projects" are like, it's being deeply disingenuous.
As someone who tries to be careful with my inputs, I find this is the exact opposite of what I want. It's unsettling because it implies a level of understanding the model simply doesn't (and shouldn't) have. This isn't just "glazing"; it's confident misinformation that makes me trust the tool less.
This kind of simulated familiarity feels like a major misstep. Thoughts?