Gemini 3 Pro is great for coding and similar tasks, but for everything else, especially decision making, it comes across as very submissive. It acts like a yes-man, too afraid to say no or risk offending me.
It plays it too safe, which leads to answers I WANT to hear rather than what I NEED to hear. It's so frustrating.
Has anybody figured out a global instruction that prevents this? The last prompt I tried swung too far the other way: it turned into too much of an a**hole, constantly trying to prove me wrong, so that didn't go much better.
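For clarity, by "global instruction" I mean something applied to every conversation, like Saved Info in the Gemini app or a system instruction via the API. If you're on the API side, this is roughly the kind of thing I've been experimenting with (a minimal sketch, assuming the google-genai Python SDK; the model ID and the wording of the instruction are just placeholders, not a recommendation):

```python
from google import genai
from google.genai import types

# Reads the API key from the GEMINI_API_KEY environment variable.
client = genai.Client()

# Hypothetical anti-sycophancy instruction; wording is an example, not a proven fix.
SYSTEM_INSTRUCTION = (
    "Be candid and direct. If my plan or assumption is flawed, say so and explain why. "
    "Do not soften conclusions just to agree with me, but stay civil and constructive."
)

response = client.models.generate_content(
    model="gemini-3-pro-preview",  # assumed model ID; substitute whatever you actually use
    contents="Should I quit my job to go all-in on my side project?",
    config=types.GenerateContentConfig(system_instruction=SYSTEM_INSTRUCTION),
)
print(response.text)
```

The tricky part, as I found out, is tuning that instruction so it's critical without tipping into contrarian, which is exactly what I'm hoping someone here has already figured out.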