Answer:
I'd say the positive effects that World War 2 had on American society were that people started having more faith in themselves; they began believing in and reaching for new goals, and thinking in a more hopeful and optimistic way.