The United States fought in World War II from 1941 to 1945, a conflict that affected all the Allied and Axis nations in both positive and negative ways. Americans had only recently recovered from World War I when they faced another devastating and costly war against Germany and the other Axis powers. World War II reshaped American society in multiple ways: social, political, and economic.