What did many Americans come to realize about the United States as a result of World War II?

asked Feb 15, 2016 by Ryaner (8.6k points) · 140k views · 0 votes
History · high-school
2 Answers

6 votes
Many Americans came to realize after World War II that they were not satisfied with their old ways of life. They wanted something better. Millions of them moved out of cities and small towns to buy newly built homes in the suburbs.
answered Feb 15, 2016 by Martin Harris (7.7k points)
4 votes
They came to realize that the United States had a great deal of power because of the invention of the atomic bomb, which killed vast numbers of people in Japan.
answered Feb 20, 2016 by Rostan (8.6k points)