How has education in the US changed and what influenced those shifts?

asked by Grigy

1 Answer


Step-by-step explanation:

I'm not sure exactly what you're looking for, but education in the US has changed in the sense that it has become more inclusive of minorities and women. It has also become increasingly focused on standardized testing and grades rather than traditional, qualitative teaching.

answered by Eoja