23 votes
At the end of WWI what type of policy did the US have regarding involvement with foreign countries?

by Jamele (3.3k points)

1 Answer

11 votes

Answer:

At the end of World War I, the United States adopted an isolationist foreign policy, avoiding involvement in foreign conflicts and alliances. That policy lasted until World War II. Even after Germany conquered Poland and France and Great Britain declared war, the United States at first remained neutral, judging that it was not its war. Only during the conflict did the US abandon isolationism and fight with the Allies against Germany, Italy, and Japan.


by Gglasses (4.0k points)