Personally, I assume it's the norm for every country to talk up its own contributions in history. However, it was suggested that we are somehow censored and taught that we were the ones who ended WWII. That's not my experience. Now, considering that education and curriculum are determined geographically, I can imagine there are some places that do teach US history as superior to all other viewpoints. Man, not my school. Nor do I think that's something that's really taught today. But it's clear there are places in the world that definitely teach anti-American history. Hell, in another thread about Andy Wang of all people, it was brought up that he teaches in China and the American textbooks used there were censored with black tape.
And this whole thing started with a European poster suggesting that the US was in no way responsible for the Nazis being defeated. After all, the people who fought the war told them that. It's just strange.