QAmmunity.org
What changes began happening for America once world war 2 was over
asked Nov 10, 2019 · 24.9k views · 2 votes
History · middle-school
asked by TheDrifter (8.7k points)
2 Answers
3 votes
The Red Scare and the Cold War came after WW2. Anti-communist sentiment grew quickly, leading to witch hunts for communists in the government. Famous examples, such as the Hollywood Ten and McCarthyism, became widely known.
answered Nov 10, 2019 by Sardar Khan (7.8k points)
1 vote
Economic prosperity was the best thing that happened to the USA after the war. Other changes included minority groups, including African Americans and Latinos, as well as women, beginning to fight for their civil rights.
answered Nov 16, 2019 by Eero (8.0k points)
Welcome to QAmmunity.org, where you can ask questions and receive answers from other members of our community.
9.5m questions · 12.2m answers
Other Questions
What goal of the constitution was also a goal of the Magna Carta?
Is it true or false that after the American Revolution, conflicts in the Northwest Territory erupted between remaining British soldiers and Native Americans?
Who made Dutch claims in North America?
How did World War 1 affect the racial and ethnic makeup of American cities?
What was an effect of nationalism in Europe in the early 1900s?