World War II transformed the United States in many ways. What do you believe was the most important way in which the war changed America?

asked by Poshi

1 Answer


Answer:

The most important change was America's emergence as a global superpower. Wartime mobilization ended the Great Depression, dramatically expanded the federal government and the industrial economy, and drew millions of women and African Americans into the workforce, setting the stage for the postwar civil rights movement and the United States' leading role in world affairs.

answered by Keeney