At the dawn of World War I, the United States was only a rising power. Our reputation was relatively benign among Middle Easterners, who saw no imperial ambitions in our presence and were grateful for the educational and philanthropic services Americans provided. Yet by September 11, 2001, everything had changed. The United States had now become the unquestioned target of those bent on attacking the West for its perceived offenses against Islam. How and why did this transformation come about?
"Hulk Smash! Hulk sorry :("