Answer & Explanation:
World War One changed the lives of Americans socially, politically, and economically. The war had a massive impact on almost every aspect of society, particularly on women, workers, and minorities.