Answer:
World War I impacted American society by changing and expanding the roles of women in the U.S. and by introducing a newfound use of propaganda.
Step-by-step explanation:
With so many men serving overseas, women stepped into factory, office, and volunteer roles that had previously been closed to them, and the federal government used propaganda on an unprecedented scale to build public support for the war effort.
I hope this helps.