First off, for the record, I'm no right-wing American lover.
However, can America ever be right? It seems that no matter what the Yanks do these days, they'll be criticised for it. If they use force, they're policing the world; if they ignore wrongdoing, they're the stuck-up Western pigs who only care about themselves. Damned if they do, damned if they don't.
What can America do to improve its global reputation without being further criticised?