More women are graduating from college and snagging more knowledge-based jobs.

2 Answers


Answer:

Is that a question or an opinion?

by Modern Labs
Yes, women in the U.S. have earned more degrees for decades.
by VasFou