What did America, France, and England want Germany to become?

Asked by FredSuvn

1 Answer


Final answer:

After World War II, America, France, and England wanted Germany to become a unified, strong nation that would aid Europe's economic recovery and serve as a bulwark against communism, a goal that led to the establishment of West Germany.

Step-by-step explanation:

After World War II, America, France, and England wanted Germany to become a strong and unified nation, serving several strategic and economic objectives. For the United States, a strong Germany was essential to Europe's economic recovery and a bulwark against communism. The British and French shared this goal, each for its own strategic reasons, hoping to stabilize the region and counter the spread of Soviet influence. Unifying the German zones occupied by these three countries was seen as essential to creating a powerful ally in the heart of Europe. This goal was formalized by the eventual merging of the three Western zones, which contrasted sharply with the Soviet-occupied zone in the east and ultimately led to the emergence of two German states: West Germany (the Federal Republic of Germany) and East Germany (the German Democratic Republic).

Answered by Meetnick
