The social and cultural changes that took place in the United States benefited the country. They advanced civil rights, gender equality, and the acceptance of different lifestyles, expanding people's opportunities and freedoms and making society more inclusive and welcoming.