Originally Posted by kwg020
Even if the USA had nothing to do with it, we have been branded as the bad guys. This will lead nowhere good.

kwg

Where has the US done any good in the world in the past 60 years?

Which war/invasion was justified?

Why is the US so hated around the world?