Answer:
The Civil War changed the lives of many women. The war gave women greater responsibility. During and after the war, women took on many jobs and roles that they had not performed before the war. Women took on the roles of doctors and nurses to help care for the soldiers. By the end of the war, many women had become nurses and doctors. This was a big change, as women had never previously worked in this field. Not long after the war, nursing became a very popular occupation among women, and it still is today. This shows that the Civil War affected not only the women of that era, but also the women of today.