Answer:
Explanation:
When did education become important in America?
By the mid-1800s, calls for free, compulsory education had begun in the United States, and compulsory schooling became widespread by the end of the century. This was an important development: children of all social classes could now receive a free, formal education.