184k views
3 votes
How did World War II change the United States’ position in world affairs?

asked by Sequoya (7.7k points)

2 Answers

3 votes
During World War II, the Americans helped form the United Nations, an international organization that sets rules governing war among nations. The Allied countries that went on to found the UN fought together to defeat the German military.
answered by Asad Shah (7.4k points)
4 votes

At the end of World War II, the United States emerged as the clear winner. Its territory was untouched by the fighting, and its economy kept expanding at a remarkable pace: its war industry converted to peacetime production, stimulated by domestic demand and by orders for goods destined for the Marshall Plan.

Thus, the United States became the richest state in the world, with 7% of the world's population consuming 45% of its wealth.

During the war the United States not only developed its economic, military, and nuclear power, but also accumulated symbolic and strategic power by proclaiming itself the defender of democracy and freedom. The most important effect was that it built a consensus among the American population, which became the effective and necessary social base for its new imperial and hegemonic role in the capitalist world.

answered by Houbie (7.8k points)