Answer & Explanation:
World War One changed American life socially, politically, and economically. The war had a massive impact on almost every aspect of society, particularly on women, workers, and minorities.