0 votes
U.S. men began proving their manliness by ...

How did Nationalism begin to gain greater influence across America?
asked by LaRae White (2.4k points)

1 Answer

13 votes

Answer:

The French Revolution helped introduce nationalism in Europe, for it changed France's entire system of government, defined citizens' rights, and developed a set of national symbols.

Explanation: After the United States entered World War I, nationalism surged across America; Americans enlisted in the military en masse, motivated by propaganda and war films.

answered by Parth Mehrotra (3.2k points)