Final answer:
Initially, nurses were seen as untrained servants, but over time, particularly during and after the World Wars, their role evolved into that of trained professionals. The expansion of educational opportunities and growing labor-market demand for healthcare workers further elevated the status of nursing.
Step-by-step explanation:
The earliest references to nursing, found in Exodus, suggest that nurses were perceived as untrained servants rather than trained professionals. The field has undergone significant transformation over the centuries. Initially, nursing roles were informal and filled by women who cared for the sick without formal training or official recognition. The perception and status of nurses changed markedly during wartime: in World Wars I and II, women served as nurses in large numbers, often close to the front lines, signaling a shift toward recognizing nursing as a profession critical to healthcare delivery. The post-war era brought expanded educational opportunities for women, increasing the number of trained professionals entering the workforce, including in nursing.
In the early 20th century, as mandatory school-attendance laws were enforced, more women entered the teaching and nursing professions with better training and education. By the 1920s, women were graduating from college in numbers equal to men, and this educated workforce contributed to the professionalization of nursing. Labor-market demand for healthcare professionals also drove nursing's transformation from an informal, domestic role into a respected profession, further elevating the status and importance of nurses in society.