How were African Americans treated after World War I?

asked by JSWilson (8.1k points)

1 Answer

African Americans emerged from the war bloodied and scarred: despite their service, Black veterans returned to segregation, discrimination, and racial violence, including the riots of the Red Summer of 1919. Nevertheless, the war marked a turning point in the struggle for freedom and equal rights that would continue throughout the 20th century and into the 21st.
answered by Frbl (7.8k points)
