How did women’s roles in countries such as the United States and Britain change after World War I? Check all that apply.

A Society became more open, and women experienced greater freedom.
B Society became more rigid, and women experienced more restrictions.
C Women were unable to win the right to vote.
D Women began to seek out new careers.
E Women challenged old traditions by doing things such as changing their clothing style.

User Ruanny

2 Answers

Answer:

A, D, E

Step-by-step explanation:

After World War I, society in countries such as the United States and Britain became more open, and women experienced greater freedom (A). Having taken on jobs during the war, many women began to seek out new careers rather than return solely to domestic roles (D), and they challenged old traditions by doing things such as cutting their hair short and changing their clothing style (E). Option C is incorrect because women did win the right to vote in this period: British women gained suffrage in 1918 (extended to all women in 1928), and American women in 1920 with the Nineteenth Amendment.

User Gopherine
For future reference, the correct answers are A, D, and E.

User Mgaughan