35 votes
Is the United States of America an empire? If so, where is the empire, and how has it changed since 1900? If not, was it ever? If it was but no longer is an empire, when did it end, and what led to its demise? Please provide historical evidence.

by Jmborr (2.8k points)

1 Answer

22 votes

Answer:

Because the United States does not seek to hold territory outright or directly govern the people of its overseas possessions, it is an indirect empire, to be sure, but an empire nonetheless.

by Nekomimi (3.0k points)