Was America justified in its imperialist expansion in the 1900s? Why or why not?

1 Answer


Answer: Whatever its origins, American imperialism reached its peak from the late 1800s through the years following World War II. During this "Age of Imperialism," the United States exerted political, social, and economic control over countries such as the Philippines, Cuba, Germany, Austria, Korea, and Japan.


Answered by MoxxiManagarm (4.6k points)