Answer: During World War I, the United States underwent social changes that reshaped life for many African Americans, immigrants, and women. These changes brought new jobs and expanded rights to groups that had long been excluded, and they helped shape America into what it is today.
Explanation:
With millions of men serving overseas and European immigration largely cut off, wartime industries needed workers. The Great Migration drew hundreds of thousands of African Americans from the rural South to Northern cities for factory jobs, and women stepped into industrial and clerical roles, strengthening the push for suffrage that led to the Nineteenth Amendment in 1920.