This is true. Although there is considerable controversy over the scope of the war's influence, it is undeniable that WWI shaped American society after 1917. During the twenties and thirties, responses to the war reflected existing political and social divisions within American society, as Americans disagreed over whether the war's impact should be celebrated. American scholars continue to investigate this wide range of responses to the war.