The Allied Powers, mainly Britain, France, and the United States, won World War I.