How has European arrival in America changed America forever?

1 Answer

European settlers drove out almost everyone who originally lived here, then built a society in which every race other than white was oppressed. They also held onto the idea that women are weaker than men, which isn't true. The result is an America where white men have lived freely and happily while everyone else has been pushed to the side.
answered by User Gerlando (9.1k points)

