Does America have an Empire?
This question was inspired by Unamed Harald, who answered my previous question about the British Empire.
Actually, yes: America is an empire. As citizens we are often blinded by dishonest representations of the facts, and for the most part by our own good intentions and values. The majority of Americans hold values and standards that make it seem impossible that anyone would be so blatantly unjust. From the viewpoint of much of the world, however, we are seen as a great deceptive tyrant that kills whomever it pleases, whenever and wherever it pleases.
If you look closely, only one other society resembles the US, and that is the Roman Empire. Like the Romans, we will fall from the corruption and dissent that come from within.
It depends on how you define 'empire'.
If you count the fact that the US conquered and displaced Native American nations such as the Chippewa and the Cherokee, that Detroit was taken from the British and New Orleans from the French, plus the territories, then yes, we could say the US has an empire.
Yet the US has resisted taking on more territories through military victories, such as the US-Mexican War or the victories over Japan and Germany. So if you consider our lack of territories gained through military victory, then no, the US does not fit the traditional imperial model.
Not in the traditional sense, which is to say not in the same way as the British Empire or the Persian Empire; but in terms of its scale of global influence, and a military at risk of being overstretched, it is.