We are the United States of America. We are America. Do these statements carry any credibility anymore? What do they even mean? Do they mean one is proud, intelligent, and fit? We wish. This utopian view of the average American died with the '50s. So again I ask: what does it mean to be an American?
Viewed from abroad, we see increasingly aggressive behavior toward Americans worldwide. We are no longer the liberators we once were; we have become the occupation force of the world. And as history has shown again and again, occupation forces always fail.
Zeroing in on the Middle East, we notice little to no sympathy for Americans. Can we even call the war in the Middle East a war anymore? A war is a conflict in which two sides fight for their ideals. America's ideal at the time was to take revenge on an invisible enemy and gain oil; the insurgents' motives were rooted in centuries of Christian-Muslim conflict that has become little more than habit to them.
Have we in turn lost our place in the world? Our nation is a shell of its once-mighty influence and has fallen by the wayside of history like the Greeks and Romans.
To be an American now means to be hated by everyone, to be a bully, and to partake in petty politics. While our nation crumbles and dies, the very people sworn to protect it are busy squabbling over trivialities like their inflated paychecks and their corporate financiers.
When we say the Pledge of Allegiance at school, I refrain. I do not commit myself to the routine of "Place your right hand over your heart, ready? Begin," because there is nothing left to pledge to. Our country is hollow and shallow. I am not proud to live in the United States, but I am proud to be an American, and the difference should be understood.