Answer:
World War I impacted American society by changing and expanding the roles of women in the U.S. and by introducing a newfound use of propaganda.
Step-by-step explanation:
With so many men serving overseas, women took on factory and office jobs that had previously been closed to them, which strengthened the push for suffrage and helped lead to the 19th Amendment. At the same time, the government's Committee on Public Information used posters, films, and speeches to build public support for the war, marking the first large-scale use of official propaganda in the United States.
I hope this helps!