How did Nationalism begin to gain greater influence across America?

asked by Mvinayakam (2.9k points)

1 Answer

5 votes

Answer:

The French Revolution helped introduce nationalism in Europe: it changed France's entire system of government, defined citizens' rights, and developed a set of national symbols.

Explanation: These nationalist ideas later took hold in the United States as well. After the United States entered World War I, nationalism surged, and Americans enlisted in the military en masse, motivated by propaganda and war films.

answered by Lab Bhattacharjee (3.6k points)