Is the West's (liberal) cultural imperialism an ironic remnant of its Christian history?

TheLastEmperorReurns (Banned) · Joined Feb 21, 2017
Western imperial states would conquer other territories and impose Christianity on the conquered masses.

Today, America and its closest developed allies push Western progressive values onto other cultures.

Laws concerning homosexuality, the treatment of animals, gender issues, etc.

Russia and China aren't concerned with forcing their values on other countries because they don't have the same history.
 
Maybe, but most of us don't give a shit either way.
 
America doesn’t “force” anything referenced in the OP on other countries. It’s not like we bomb them until they comply. We do, however, impose trade sanctions on countries whose political systems or laws we have deemed cruel or oppressive, which we should. We don’t have to trade with them. They have no inherent right to trade with us. It’s the same way you would stop eating at McDonald’s if it were supporting ISIS.

And before you say the UN forces them, or we force them through the UN, they opt into the UN. Being part of a global community and agreeing to abide by the social standards it sets is voluntary.

The only cultural imperialism we engage in is bombing the shit out of Middle Eastern countries whose dictators aren’t friendly with us.
 