How did World War I change the role of government in the United States?

1 Answer


Answer:

World War I produced a much closer relationship between the federal government and private industry, with the government taking a far more active role in directing the economy for the war effort. The United States itself saw no major battles or attacks on its own soil.


Hope this helps!

by Tobias Lorenz
