More women are graduating from college and snagging more knowledge-based jobs.

2 Answers

Is that a question or an opinion?

answered by Modern Labs (3.2k points)

Yes, women in the U.S. have earned more college degrees than men for decades.
answered by VasFou (3.3k points)