What is the belief that women should have economic, political and social equality with men?

asked by Ccallendar

1 Answer

Feminism is the term you're looking for. Although the author argues it has already been achieved, many people still refuse to accept that feminism is no longer needed in the United States.
answered by Pooria