Answer:
World War I impacted American society by expanding the roles of women in the U.S. and by introducing a newfound use of propaganda.
Step-by-step explanation:
With so many men serving overseas, women stepped into factory, clerical, and volunteer roles on the home front, which strengthened the push for suffrage. At the same time, the government's Committee on Public Information used posters, films, and speeches to build support for the war, making propaganda a major tool of American public life.
I hope this helps!