Over the last couple of days there seems to be an emerging theme among the blogs I am reading... a perception among Americans that the world does not look as highly upon America as it once did. Or at least that those who matter to American self-esteem... meaning, mostly, Europe... don't think as highly of America as they once did.
I'd like to share a thought on that, as someone who admires America but is not American.
"Why'd we go save their asses from Hitler...??" ... I've read or heard, so many times.
Allow me to be a little blunt about this... that conceit doesn't go over well.
America's home cities were never subjected to devastating air strikes. Its farmlands never churned up under the wheels of ravenous armies (not since the Civil War...). Its minorities never taken away to death camps. Its civilians never raped en masse or subjected to reprisals.
Others got the broken end of that bottle. Britain and the rest of the Commonwealth were at war for over two years before Pearl Harbor. Furthermore, it was not only America that swung the balance... it was also the Russians - who lost more people than anyone else, who were labouring under an oppressive regime of their own, and whose front swallowed as much as three quarters of the German war machine. Enough with "we saved your asses".
America would NOT have liked the rest of the world sealed shut inside the Third Reich, Stalin's Soviet Union and the Japanese Empire. It would have been a new dark age for America as well.
But history rolls on, and when you have half the world tied up in a nuclear stand-off with an adversarial economic system like Soviet-style communism, and you have near total economic hegemony, and things at home are pretty cool and carefree... you can assume the rest of the world thinks you're as glamorous as you think you are.
When the Soviets are gone but you still have everything else, you look for farcical pseudo-threats like "trade wars".
But when there is no Cold War narrative left ongoing, the economy is shaky and you have an atrocity like 9/11 to contend with... "carefree" goes right out the window.
When the central narrative of American politics becomes "what is America becoming?", all those quaint foreigners who you helped in some war long ago suddenly have opinions that seem to matter, because where else are you going to look for comparisons?
When Obama was elected president, the world didn't celebrate because he was a black guy. They celebrated because they believed he was better for America... that a McCain-Palin administration would have been a freak show and would have driven the US economy off a cliff. They still believe that. It's still true. It wasn't just the promise of what Obama might represent; it was also the crawling dread of what the alternative would have meant.
Reactionary Americans, a group that takes in most of the Right-wing blogosphere, still don't give a damn what anyone thinks. If the solutions don't fit some mythic idea of what America is, was, or is remembered (often wrongly) to be... then those solutions are of no use to them.
The rest of the world has not changed its opinion of America. It is still greatly admired for what it does well.
I'm starting to think, though, that Americans have become more aware of what the rest of the world thinks.