5 votes
At what university did American sociology emerge?

a. University of Chicago
b. Harvard University
c. Stanford University
d. Yale University

asked by Ran Cohen (7.4k points)

1 Answer

3 votes

Final answer:

American sociology emerged at the University of Chicago.

Step-by-step explanation:

American sociology emerged at the University of Chicago. In 1892, Albion Small founded the first American department of sociology there, and in the early 1900s Chicago sociologists such as Small, Robert Park, and Ernest Burgess played leading roles in establishing sociology as an academic discipline in the United States. (Other early American sociologists, William Sumner and Franklin Giddings, worked at Yale and Columbia, respectively.) These early sociologists tested and applied European theories, and they became leaders in social research.

The University of Chicago's Department of Sociology, often called the Chicago School, became known especially for its contributions to urban sociology and the study of cities.

answered by Ivanov Maksim (8.2k points)