How did the political role of American women change in the years after World War I?

1 Answer

They gained the right to vote with the ratification of the 19th Amendment in 1920, and they had more freedom to participate in political life.
by Antonio Laguna
