235k views
3 votes
How did women's roles in countries such as the United States and Britain change after World War I? (Check all that apply.)

asked by User Ebenezer (5.2k points)

1 Answer

5 votes

Answer:

Society became more open, and women experienced greater freedom.

Women began to seek out new careers.

Women challenged old traditions by doing things such as changing their clothing style.

answered by User Colin Anthony (5.4k points)