28 votes
What changes occurred in America during WWII?

asked by David Folkner (3.2k points)

1 Answer

22 votes

Answer:

During WWII, the United States initially tried its best to stay out of the war that Nazi Germany had started in Europe. However, Imperial Japan's bombing of Pearl Harbor in Hawaii forced the United States to enter the war on the side of the Allies against the Axis powers. Production and the economy boomed: demand for weapons and armor soared, putting unemployed Americans back to work, and the effects of the Great Depression slowly disappeared.

Explanation:

answered by Mkn (3.3k points)