How did the Civil War change what it means to be American?

Please help!

2 Answers

The Civil War? Oh, well, that's easy. It gained us more freedom and rights, leading to the creation of advanced technology, which changed America forever and still improves it today!
by User Mohsen (6.1k points)
I'm late, but oh well. The Civil War brought unity to America, and we became less divided between North and South. It established that Americans are not just one skin color; Americans can be any color. Being American no longer had to mean feeling entitled to treat other people poorly just because it had always been done that way. It showed that, over time, we could change together as a nation for the greater good.
by User Wizard Of Kneup (6.0k points)