219,742 views
1 vote
Discussion Topic

World War II was one of the most significant events of the 1900s and one of the
most important events in US history. Think about how much the United States
changed between the Great Depression and the postwar era, when the country
had become an economic powerhouse. How did World War II influence and
change the identity of the United States throughout the 1900s and into the
present? What are some positive and negative changes that occurred in the
United States in the years after World War II?

asked by Henk Holterman (2.7k points)

1 Answer

18 votes

Answer: World War II helped shape the United States' identity by building up its military and showing the Axis powers (Japan, Italy, and Germany) that it was a world power, which made the United States a major threat to them. A positive change was that the war created new military jobs for people who had been unemployed. A negative impact was the human cost: the war caused millions of deaths worldwide.

answered by Majik (2.7k points)