Answer:
(Answer is in explanation)
Step-by-step explanation:
It depends on which war you mean, but to sum it up generally: wars have tended to change women's roles. Thousands of women became more independent and courageous through their wartime experiences, which helped advance causes such as women's rights. Hope this helps :)