3 votes
Nursing is a Woman's Job
Please discuss


2 Answers

3 votes

Answer:

Yes, nursing is a woman's job.

by Dalvinder Singh (6.8k points)
4 votes
Nursing is often seen as a woman's job because women are thought to have motherly instincts and to be kind and caring. Most nurses are female, though some are men, and they all care for those in need.

Hope this helped ♥︎
by Sergey Rybalkin (6.8k points)