More women are graduating from college and snagging more knowledge-based jobs.

2 Answers

7 votes

Is that a question or an opinion?

by Modern Labs (8.7k points)
10 votes
Yes, women in the U.S. have earned more degrees for decades.
by VasFou (7.2k points)