You could argue this both ways.
Step-by-step explanation:
In my opinion, WW2 had more positive impacts because WW2 is what got the US and other countries out of the Great Depression of the 1930s. More women were able to enter the workplace as many men were drafted into the military. Propaganda in the US also increased nationalism; everyone seemed united to defeat the "enemy". Then, into the 1950s, the US was very prosperous. The economy grew and people were able to enjoy themselves. Music like rock and roll was enjoyed by the youth, and there were television shows like "Father Knows Best" and "Leave It to Beaver".