90.6k views
5 votes
What bad changes happened in the USA after World War 2?

2 Answers

6 votes
After the war ended in the summer of 1945, soldiers began to come home to their families. Over time, industries stopped making war equipment. It was more peaceful, and the economy was stronger than ever!
User Williamsandonz
by
7.2k points
5 votes
The aftermath of World War II was the beginning of an era defined by the decline of the old great powers and the rise of two superpowers: the Soviet Union (USSR) and the United States of America (USA), creating a bipolar world.
User Gazler
by
7.7k points