Answer:
The United States of America played an important role in World War II. Over the course of the conflict, it shifted its foreign policy and ultimately fought with the Allies against Germany, Italy, and Japan. Just before and at the beginning of the war, US foreign policy was isolationist. After Germany invaded Poland and France and Great Britain declared war, the United States remained neutral and did not want to be drawn into the fighting, judging that it was not its war.