The idea that American perspectives on culture, music, fashion, food, politics, religion, or whatever else you can think of are fundamentally superior to those of all other countries or groups, and should be spread around the world not just because we want to, but because it's good for you.

Obviously there are a lot of flaws in this thinking. First: while I haven't traveled much, I've been to parts of Europe, and there are some things they do quite well that we shouldn't be trying to screw up by making them do it our way. Second: everyone living an American lifestyle would suck the planet dry in a matter of decades--if anything, we need to cut way the hell back on our consumption of natural resources. Third: our culture depends on exploitation, both of our own people and of people in the Third World.

American cultural imperialism is very powerful, since much of the media and means of production worldwide are controlled by American companies and American capital. People in many parts of the world seem to aspire to be more American, I think because of the ideals of wealth and power that the media associate with Americans. The cold hard truth is that we're not better than a lot of people, so don't feel inadequate just 'cause you weren't born in the US of A.
