24.9k views
2 votes
What changes began happening for America once World War 2 was over?

by TheDrifter (8.7k points)

2 Answers

3 votes
The Red Scare and the Cold War came after WW2. Anti-communist sentiment grew quickly, which led to witch hunts for communists in the government. Famous examples include the Hollywood Ten, and McCarthyism became widespread.
by Sardar Khan (7.8k points)
1 vote
Economic prosperity was the biggest change for the USA after the war. Other changes included minority groups fighting for their civil rights, including African Americans and Latinos, as well as women.
by Eero (8.0k points)