1 vote
What is the meaning of this line? Before the war, Americans tended to say "The United States are," but after the war, they said "The United States is."

2 Answers

3 votes

Answer:

It means that after they finished the war, they saw themselves as true Americans fighting for their country, one unified nation rather than a collection of separate states.

by Siddharth (4.9k points)
2 votes

Answer:

They underestimated the United States before they had seen its potential.

by VAndrei (4.7k points)