Answer:
In my opinion, the most significant impact World War I had on American society was the advancement of women's rights. While the men were away fighting, the women of America stepped up and took factory jobs and other work traditionally done by men. Women proved, to themselves and to the country, that they could do this work. That is a major reason why, two years after the war ended, women won the right to vote with the ratification of the Nineteenth Amendment in 1920.