Is the United States of America an empire? If so, where is the empire, and how has it changed since 1900? If not, was it ever? If it was an empire but no longer is, when did it end, and what led to its demise? Please include a historical example and underline the thesis statement, since this is for a quick write.

by User Fifix

1 Answer

The USA is not called an empire, and it never was an empire.
by User Sturla