I'm talking about movies made by American film companies that are set in America (or in an unambiguous or fantasy/sci-fi setting) but were not actually filmed here. My roommate keeps insisting that movies filmed in America, about America, for Americans are just better than anything made anywhere else in the world. I just want examples of those kinds of movies that weren't filmed here.
Edit: Thanks, everyone. I knew that most films were shot elsewhere; I just wanted specifics so I wouldn't sound as dumb when I made my case.