How were African Americans treated after WWI?


1 Answer

African Americans emerged from the war bloodied and scarred, facing continued discrimination and violence. Nevertheless, the war marked a turning point in their struggle for freedom and equal rights, one that would continue throughout the 20th century and into the 21st.