Is the United States of America an empire? If so, where is the empire, and how has it changed since 1900? If not, was it ever? If it was but no longer is an empire, when did it end, and what led to its demise? Please provide historical evidence.

asked by User Ckarras

1 Answer


Answer:

Because the United States does not seek to control territory or to govern the citizens of overseas possessions directly, it is an indirect empire, to be sure, but an empire nonetheless.

answered by User AlenBer