Nursing is a Woman's Job
Please discuss


2 Answers


Answer:

Yes, nursing is a woman's job.

by Dalvinder Singh
Nursing is a woman's job because women have motherly instincts; they are also kind and caring. Most nurses are female, though some are men. They also care for those in need.

Hope this helped ♥︎
by Sergey Rybalkin